[ 513.418043] env[67770]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 514.043225] env[67820]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 515.389878] env[67820]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67820) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 515.390253] env[67820]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67820) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 515.390427] env[67820]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67820) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 515.390757] env[67820]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 515.600881] env[67820]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67820) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 515.611209] env[67820]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=67820) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 515.713418] env[67820]: INFO nova.virt.driver [None req-39d7552c-1e6d-471d-81a6-9a6adc357325 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 515.785382] env[67820]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 515.785537] env[67820]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 515.785642] env[67820]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67820) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 518.669156] env[67820]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-c730f60c-6dda-4254-9bd9-0684f13c6c7a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.685653] env[67820]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67820) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 518.685794] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-ec067830-9adb-45aa-a717-af6c3cebb9a4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.712788] env[67820]: INFO oslo_vmware.api [-] Successfully established new session; session ID is b66c1.
[ 518.712913] env[67820]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.927s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 518.713476] env[67820]: INFO nova.virt.vmwareapi.driver [None req-39d7552c-1e6d-471d-81a6-9a6adc357325 None None] VMware vCenter version: 7.0.3
[ 518.716962] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a03c83-b964-4528-b65e-70b0df5ff98d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.735128] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42b0a59f-b00b-4ae2-8c81-e45c7ca9677e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.740968] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4230183-3d24-4e70-94b4-468088185e25 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.747406] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35c71314-5bb8-4af1-81dc-99d8c5f98746 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.760387] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6302bfdc-8d4f-4c6d-ab1d-fcbcf775f3ac {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.766080] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075fad78-d40e-4e8b-a89a-310f53ef70e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.797084] env[67820]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-cbb95f37-1756-4977-b341-c0ba4b1a234b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 518.802594] env[67820]: DEBUG nova.virt.vmwareapi.driver [None req-39d7552c-1e6d-471d-81a6-9a6adc357325 None None] Extension org.openstack.compute already exists. {{(pid=67820) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 518.805262] env[67820]: INFO nova.compute.provider_config [None req-39d7552c-1e6d-471d-81a6-9a6adc357325 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 518.832307] env[67820]: DEBUG nova.context [None req-39d7552c-1e6d-471d-81a6-9a6adc357325 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),6fe3a8b0-4da6-4759-9486-5c2ccfaded2d(cell1) {{(pid=67820) load_cells /opt/stack/nova/nova/context.py:464}}
[ 518.834314] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 518.834537] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 518.835253] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 518.835668] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Acquiring lock "6fe3a8b0-4da6-4759-9486-5c2ccfaded2d" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 518.835854] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Lock "6fe3a8b0-4da6-4759-9486-5c2ccfaded2d" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 518.836854] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Lock "6fe3a8b0-4da6-4759-9486-5c2ccfaded2d" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 518.861569] env[67820]: INFO dbcounter [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Registered counter for database nova_cell0
[ 518.870043] env[67820]: INFO dbcounter [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Registered counter for database nova_cell1
[ 518.873067] env[67820]: DEBUG oslo_db.sqlalchemy.engines [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67820) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 518.873426] env[67820]: DEBUG oslo_db.sqlalchemy.engines [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67820) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 518.877987] env[67820]: DEBUG dbcounter [-] [67820] Writer thread running {{(pid=67820) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 518.878740] env[67820]: DEBUG dbcounter [-] [67820] Writer thread running {{(pid=67820) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 518.880886] env[67820]: ERROR nova.db.main.api [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 518.880886] env[67820]:     result = function(*args, **kwargs)
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 518.880886] env[67820]:     return func(*args, **kwargs)
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 518.880886] env[67820]:     result = fn(*args, **kwargs)
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 518.880886] env[67820]:     return f(*args, **kwargs)
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 518.880886] env[67820]:     return db.service_get_minimum_version(context, binaries)
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 518.880886] env[67820]:     _check_db_access()
[ 518.880886] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 518.880886] env[67820]:     stacktrace = ''.join(traceback.format_stack())
[ 518.880886] env[67820]: 
[ 518.881920] env[67820]: ERROR nova.db.main.api [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 518.881920] env[67820]:     result = function(*args, **kwargs)
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 518.881920] env[67820]:     return func(*args, **kwargs)
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 518.881920] env[67820]:     result = fn(*args, **kwargs)
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 518.881920] env[67820]:     return f(*args, **kwargs)
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 518.881920] env[67820]:     return db.service_get_minimum_version(context, binaries)
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 518.881920] env[67820]:     _check_db_access()
[ 518.881920] env[67820]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 518.881920] env[67820]:     stacktrace = ''.join(traceback.format_stack())
[ 518.881920] env[67820]: 
[ 518.882317] env[67820]: WARNING nova.objects.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Failed to get minimum service version for cell 6fe3a8b0-4da6-4759-9486-5c2ccfaded2d
[ 518.882438] env[67820]: WARNING nova.objects.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 518.882895] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Acquiring lock "singleton_lock" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 518.883018] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Acquired lock "singleton_lock" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 518.883273] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Releasing lock "singleton_lock" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 518.883588] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Full set of CONF: {{(pid=67820) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 518.883727] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ******************************************************************************** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 518.883903] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] Configuration options gathered from: {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 518.884086] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 518.884287] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 518.884414] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ================================================================================ {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 518.884621] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] allow_resize_to_same_host = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.884788] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] arq_binding_timeout = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.884916] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] backdoor_port = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885052] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] backdoor_socket = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885220] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] block_device_allocate_retries = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885384] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] block_device_allocate_retries_interval = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885551] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cert = self.pem {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885713] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.885879] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute_monitors = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886078] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] config_dir = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886261] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] config_drive_format = iso9660 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886393] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886556] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] config_source = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886718] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] console_host = devstack {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.886881] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] control_exchange = nova {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887056] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cpu_allocation_ratio = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887219] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] daemon = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887388] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] debug = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887542] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_access_ip_network_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887703] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_availability_zone = nova {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.887855] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_ephemeral_format = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888016] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_green_pool_size = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888258] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888445] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] default_schedule_zone = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888608] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] disk_allocation_ratio = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888769] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] enable_new_services = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.888946] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] enabled_apis = ['osapi_compute'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889124] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] enabled_ssl_apis = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889285] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] flat_injected = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889440] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] force_config_drive = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889597] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] force_raw_images = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889762] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] graceful_shutdown_timeout = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.889910] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] heal_instance_info_cache_interval = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890138] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] host = cpu-1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890312] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890474] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] initial_disk_allocation_ratio = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890633] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] initial_ram_allocation_ratio = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890840] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.890999] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_build_timeout = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891173] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_delete_interval = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891336] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_format = [instance: %(uuid)s] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891495] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_name_template = instance-%08x {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891651] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_usage_audit = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891814] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_usage_audit_period = month {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.891974] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892154] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] instances_path = /opt/stack/data/nova/instances {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892316] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] internal_service_availability_zone = internal {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892468] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] key = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892621] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] live_migration_retry_count = 30 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892781] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_config_append = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.892940] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893110] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_dir = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893271] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893398] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_options = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893557] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_rotate_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893722] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_rotate_interval_type = days {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.893885] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] log_rotation_type = none {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894035] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894184] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894355] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894520] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894646] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894804] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] long_rpc_timeout = 1800 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.894959] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_concurrent_builds = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895131] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_concurrent_live_migrations = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895287] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_concurrent_snapshots = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895440] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_local_block_devices = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895593] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_logfile_count = 30 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895744] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] max_logfile_size_mb = 200 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.895897] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] maximum_instance_delete_attempts = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896101] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metadata_listen = 0.0.0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896286] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metadata_listen_port = 8775 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896454] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metadata_workers = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896611] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] migrate_max_retries = -1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896773] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] mkisofs_cmd = genisoimage {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.896976] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] my_block_storage_ip = 10.180.1.21 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897117] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] my_ip = 10.180.1.21 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897279] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] network_allocate_retries = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897451] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897613] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] osapi_compute_listen = 0.0.0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897770] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] osapi_compute_listen_port = 8774 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.897934] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] osapi_compute_unique_server_name_scope = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898113] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] osapi_compute_workers = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898274] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] password_length = 12 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898466] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] periodic_enable = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898633] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] periodic_fuzzy_delay = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898803] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] pointer_model = usbtablet {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.898970] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] preallocate_images = none {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899145] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] publish_errors = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899277] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] pybasedir = /opt/stack/nova {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899433] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ram_allocation_ratio = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899590] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rate_limit_burst = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899754] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rate_limit_except_level = CRITICAL {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.899911] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rate_limit_interval = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900080] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reboot_timeout = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900244] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reclaim_instance_interval = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900398] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] record = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900565] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reimage_timeout_per_gb = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900728] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] report_interval = 120 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.900885] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rescue_timeout = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901051] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reserved_host_cpus = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901214] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reserved_host_disk_mb = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901367] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reserved_host_memory_mb = 512 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901524] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] reserved_huge_pages = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901679] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] resize_confirm_window = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901834] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] resize_fs_using_block_device = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.901990] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] resume_guests_state_on_host_boot = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902172] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902332] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rpc_response_timeout = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902486] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] run_external_periodic_tasks = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902650] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] running_deleted_instance_action = reap {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902806] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] running_deleted_instance_poll_interval = 1800 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.902962] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] running_deleted_instance_timeout = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903138] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler_instance_sync_interval = 120 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903303] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_down_time = 720 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903464] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] servicegroup_driver = db {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903620] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] shelved_offload_time = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903773] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] shelved_poll_interval = 3600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.903936] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] shutdown_timeout = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.904133] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] source_is_ipv6 = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.904303] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ssl_only = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.904545] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.904712] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] sync_power_state_interval = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.904871] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] sync_power_state_pool_size = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905047] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] syslog_log_facility = LOG_USER {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905209] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] tempdir = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905380] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] timeout_nbd = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905531] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] transport_url = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905686] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] update_resources_interval = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905840] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_cow_images = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.905993] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_eventlog = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.906182] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_journal = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.906341] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_json = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.906495] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_rootwrap_daemon = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.906646] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_stderr = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907129] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] use_syslog = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907129] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vcpu_pin_set = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907129] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plugging_is_fatal = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907271] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plugging_timeout = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907417] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] virt_mkfs = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907570] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] volume_usage_poll_interval = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907724] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] watch_log_file = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.907884] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] web = /usr/share/spice-html5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 518.908080] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_concurrency.disable_process_locking = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.908387] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.908578] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.908744] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.908913] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909094] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909262] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909440] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.auth_strategy = keystone {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909601] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.compute_link_prefix = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909769] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.909939] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.dhcp_domain = novalocal {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910129] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.enable_instance_password = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910279] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.glance_link_prefix = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910447] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910617] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910777] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.instance_list_per_project_cells = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.910933] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.list_records_by_skipping_down_cells = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911107] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.local_metadata_per_cell = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911273] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.max_limit = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911438] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.metadata_cache_expiration = 15 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911611] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.neutron_default_tenant_id = default {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911774] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.use_forwarded_for = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.911935] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.use_neutron_default_nets = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912112] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912277] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912439] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912605] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912773] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_dynamic_targets = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.912930] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_jsonfile_path = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913120] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913354] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.backend = dogpile.cache.memcached {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913471] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.backend_argument = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913641] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.config_prefix = cache.oslo {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913808] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.dead_timeout = 60.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.913971] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.debug_cache_backend = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.914196] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.enable_retry_client = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.914368] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.enable_socket_keepalive = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.914541] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.enabled = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.914707] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.expiration_time = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.914868] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.hashclient_retry_attempts = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.915041] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.hashclient_retry_delay = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.915210] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_dead_retry = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.915376] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_password = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 518.915539] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.915699] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.915859] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_pool_maxsize = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916049] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916229] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_sasl_enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916414] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916581] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_socket_timeout = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916748] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.memcache_username = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.916910] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.proxies = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917086] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.retry_attempts = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917253] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.retry_delay = 0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917416] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.socket_keepalive_count = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917577] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.socket_keepalive_idle = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917735] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.socket_keepalive_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.917888] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.tls_allowed_ciphers = None {{(pid=67820) log_opt_values 
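The [cache] group above is oslo.cache configuration: dogpile.cache.memcached against localhost:11211 with a 600-second default TTL. A rough sketch of how a service builds a cache region from these options, assuming oslo.cache's usual entry points and mirroring this log's values with explicit overrides (normally they come from nova.conf; the final set call needs a reachable memcached):

from oslo_cache import core as cache
from oslo_config import cfg

CONF = cfg.CONF
cache.configure(CONF)     # registers the [cache] options seen above
CONF([])

CONF.set_override('enabled', True, group='cache')
CONF.set_override('backend', 'dogpile.cache.memcached', group='cache')
CONF.set_override('memcache_servers', ['localhost:11211'], group='cache')
CONF.set_override('expiration_time', 600, group='cache')

region = cache.create_region()
cache.configure_cache_region(CONF, region)  # applies backend/servers/TTL
region.set('key', 'value')                  # standard dogpile region API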
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918053] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.tls_cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918214] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.tls_certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918390] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.tls_enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918555] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cache.tls_keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918723] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.918897] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.auth_type = password {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919067] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919273] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.catalog_info = volumev3::publicURL {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919410] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919564] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919723] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.cross_az_attach = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.919881] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.debug = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920051] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.endpoint_template = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920229] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.http_retries = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920379] env[67820]: DEBUG oslo_service.service [None 
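The [cinder] options above (auth_type, cafile, timeout, catalog_info = volumev3::publicURL, os_region_name = RegionOne) are standard keystoneauth1 plumbing. A hedged sketch of how such a group is typically turned into an authenticated session; the nova.conf path is hypothetical:

from keystoneauth1 import loading as ks_loading
from oslo_config import cfg

CONF = cfg.CONF
ks_loading.register_session_conf_options(CONF, 'cinder')  # cafile, timeout, insecure, ...
ks_loading.register_auth_conf_options(CONF, 'cinder')     # auth_type, auth_section

CONF(['--config-file', '/etc/nova/nova.conf'])            # hypothetical config file
auth = ks_loading.load_auth_from_conf_options(CONF, 'cinder')
sess = ks_loading.load_session_from_conf_options(CONF, 'cinder', auth=auth)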
req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920535] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920704] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.os_region_name = RegionOne {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.920866] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921036] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cinder.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921212] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921371] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.cpu_dedicated_set = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921527] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.cpu_shared_set = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921689] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.image_type_exclude_list = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.921850] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922015] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.max_concurrent_disk_ops = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922184] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.max_disk_devices_to_attach = -1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922343] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922510] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922676] env[67820]: DEBUG oslo_service.service 
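compute.cpu_dedicated_set and compute.cpu_shared_set are both None above, so no dedicated/shared CPU split is configured; when set, they take range lists such as "0-3,8-11" with "^N" exclusions. A hypothetical helper, not Nova's code, illustrating that syntax:

def parse_cpu_set(spec):
    """Parse a CPU-set string like '0-3,8-11,^9' into a set of ints."""
    cpus, excluded = set(), set()
    for part in spec.split(','):
        part = part.strip()
        target = cpus
        if part.startswith('^'):          # '^N' marks an exclusion
            target, part = excluded, part[1:]
        if '-' in part:
            lo, hi = part.split('-')
            target.update(range(int(lo), int(hi) + 1))
        else:
            target.add(int(part))
    return cpus - excluded

assert parse_cpu_set('0-3,8-11,^9') == {0, 1, 2, 3, 8, 10, 11}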
[None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.resource_provider_association_refresh = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.922842] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.shutdown_retry_interval = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923029] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923215] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] conductor.workers = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923387] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] console.allowed_origins = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923543] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] console.ssl_ciphers = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923711] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] console.ssl_minimum_version = default {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.923879] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] consoleauth.token_ttl = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924075] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924249] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924413] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924572] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924728] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.924885] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925055] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] 
cyborg.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925222] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925378] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925533] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925687] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.925840] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926047] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.service_type = accelerator {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926204] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926366] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926520] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926682] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.926853] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.927436] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] cyborg.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.927436] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.backend = sqlalchemy {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.927436] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.connection = **** {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.connection_debug = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.connection_parameters = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.connection_recycle_time = 3600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.connection_trace = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.db_inc_retry_interval = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928529] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.db_max_retries = 20 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928695] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.db_max_retry_interval = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928695] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.db_retry_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928906] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.max_overflow = 50 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.928993] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.max_pool_size = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929147] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.max_retries = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929316] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929510] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.mysql_wsrep_sync_wait = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929633] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.pool_timeout = None {{(pid=67820) log_opt_values 
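The [database] options above map onto SQLAlchemy connection-pool parameters (oslo.db applies them internally): max_pool_size = 5 with max_overflow = 50 allows bursts of up to 55 connections, and connection_recycle_time = 3600 retires connections after an hour to avoid stale sockets. A rough direct-SQLAlchemy equivalent; the URL is invented, since database.connection is masked in the log:

from sqlalchemy import create_engine

engine = create_engine(
    'mysql+pymysql://nova:***@dbhost/nova',  # hypothetical; real value is ****
    pool_size=5,         # database.max_pool_size
    max_overflow=50,     # database.max_overflow
    pool_recycle=3600,   # database.connection_recycle_time
)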
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929799] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.retry_interval = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.929954] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.slave_connection = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930134] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.sqlite_synchronous = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930328] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] database.use_db_reconnect = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930474] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.backend = sqlalchemy {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930649] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.connection = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930814] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.connection_debug = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.930981] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.connection_parameters = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931159] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.connection_recycle_time = 3600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931327] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.connection_trace = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931488] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.db_inc_retry_interval = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931650] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.db_max_retries = 20 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931809] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.db_max_retry_interval = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.931972] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.db_retry_interval = 1 {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932149] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.max_overflow = 50 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932310] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.max_pool_size = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932474] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.max_retries = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932640] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932797] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.932955] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.pool_timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933142] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.retry_interval = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933305] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.slave_connection = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933467] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] api_database.sqlite_synchronous = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933638] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] devices.enabled_mdev_types = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933809] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.933969] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ephemeral_storage_encryption.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934182] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934358] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.api_servers = None {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934521] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934681] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934842] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.934999] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935173] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935333] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.debug = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935493] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.default_trusted_certificate_ids = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935652] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.enable_certificate_validation = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935810] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.enable_rbd_download = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.935965] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936165] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936332] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936490] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936643] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936800] env[67820]: DEBUG 
oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.num_retries = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.936966] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.rbd_ceph_conf = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937137] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.rbd_connect_timeout = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937305] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.rbd_pool = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937472] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.rbd_user = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937630] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937785] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.937951] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.service_type = image {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938123] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938286] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938438] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938595] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938771] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.938931] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.verify_glance_signatures = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939101] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] glance.version = None {{(pid=67820) 
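The [glance] endpoint options above (service_type = image, valid_interfaces, region_name, endpoint_override) are keystoneauth adapter knobs. A sketch of loading them; the unauthenticated session is only a placeholder, so endpoint discovery would work only with a real auth plugin:

from keystoneauth1 import loading as ks_loading
from keystoneauth1 import session as ks_session
from oslo_config import cfg

CONF = cfg.CONF
ks_loading.register_adapter_conf_options(CONF, 'glance')
CONF([])

sess = ks_session.Session()   # placeholder; a real session carries auth
adapter = ks_loading.load_adapter_from_conf_options(CONF, 'glance', session=sess)
# adapter.get_endpoint() would then resolve the image endpoint from the
# service catalog, honoring glance.valid_interfaces / glance.region_name.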
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939272] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] guestfs.debug = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939441] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.config_drive_cdrom = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939602] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.config_drive_inject_password = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939766] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.939925] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.enable_instance_metrics_collection = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940096] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.enable_remotefx = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940270] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.instances_path_share = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940434] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.iscsi_initiator_list = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940592] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.limit_cpu_features = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940748] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.940907] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941075] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.power_state_check_timeframe = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941246] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941413] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67820) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941570] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.use_multipath_io = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941728] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.volume_attach_retry_count = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.941886] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.942050] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.vswitch_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.942214] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.942379] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] mks.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.942735] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.942924] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.manager_interval = 2400 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943105] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.precache_concurrency = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943281] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.remove_unused_base_images = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943451] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943621] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943794] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] image_cache.subdirectory_name = _base {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.943971] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.api_max_retries 
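The [image_cache] options above encode the aging policy for cached base images: a periodic task (every manager_interval = 2400 s) may remove unused originals after 86400 s and resized copies after 3600 s, under the _base subdirectory. A hypothetical illustration of the age test only, not Nova's implementation:

import os
import time

MIN_AGE_ORIGINAL = 86400  # image_cache.remove_unused_original_minimum_age_seconds
MIN_AGE_RESIZED = 3600    # image_cache.remove_unused_resized_minimum_age_seconds

def removable(path, resized):
    """True if an unused cached image is old enough to be pruned."""
    age = time.time() - os.path.getmtime(path)
    return age > (MIN_AGE_RESIZED if resized else MIN_AGE_ORIGINAL)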
= 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944165] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.api_retry_interval = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944331] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944493] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.auth_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944650] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944804] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.944965] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945139] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.conductor_group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945298] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945456] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945610] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945768] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.945924] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946119] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946293] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946458] env[67820]: DEBUG 
oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.peer_list = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946613] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946774] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.serial_console_state_timeout = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.946928] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947104] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.service_type = baremetal {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947268] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947422] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947575] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947729] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.947905] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948075] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ironic.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948263] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948435] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] key_manager.fixed_key = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948615] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948774] env[67820]: DEBUG oslo_service.service [None 
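key_manager.backend above selects Nova's single-fixed-key ConfKeyManager (key_manager.fixed_key itself is masked), while the [barbican] and [vault] groups that follow configure the alternative castellan backends. A minimal, assumption-laden sketch of obtaining the key manager through castellan:

from castellan import key_manager
from oslo_config import cfg

CONF = cfg.CONF
km = key_manager.API(CONF)  # instantiates the class named by key_manager.backend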
req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.barbican_api_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.948931] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.barbican_endpoint = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949109] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.barbican_endpoint_type = public {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949270] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.barbican_region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949426] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949581] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949740] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.949897] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950060] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950226] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.number_of_retries = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950385] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.retry_delay = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950541] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.send_service_user_token = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950698] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.950851] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951016] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.verify_ssl = True {{(pid=67820) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951176] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican.verify_ssl_path = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951342] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951501] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.auth_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951659] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951813] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.951972] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952143] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952301] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952461] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952618] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] barbican_service_user.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952783] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.approle_role_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.952940] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.approle_secret_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953115] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953276] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.certfile = None {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953437] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953594] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953748] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.953930] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.kv_mountpoint = secret {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954139] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.kv_path = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954316] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.kv_version = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954477] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.namespace = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954636] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.root_token_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954795] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.954949] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.ssl_ca_crt_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955119] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955284] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.use_ssl = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955452] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955617] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955779] env[67820]: DEBUG oslo_service.service [None 
req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.auth_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.955935] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956126] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956300] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956457] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956614] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956769] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.956925] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957091] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957250] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957403] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957554] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957704] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.957867] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.service_type = identity {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958034] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.split_loggers = False {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958194] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958352] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958504] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958682] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.958839] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] keystone.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959046] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.connection_uri = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959214] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_mode = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959379] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_model_extra_flags = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959546] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_models = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959713] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_power_governor_high = performance {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.959879] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_power_governor_low = powersave {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960049] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_power_management = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960222] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960384] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.device_detach_attempts = 8 {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960542] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.device_detach_timeout = 20 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960705] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.disk_cachemodes = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.960863] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.disk_prefix = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961033] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.enabled_perf_events = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961205] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.file_backed_memory = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961367] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.gid_maps = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961520] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.hw_disk_discard = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961674] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.hw_machine_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961838] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_rbd_ceph_conf = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.961997] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962180] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962350] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_rbd_glance_store_name = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962512] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_rbd_pool = rbd {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962678] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_type = default {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962834] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.images_volume_group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.962992] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.inject_key = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963169] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.inject_partition = -2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963328] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.inject_password = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963485] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.iscsi_iface = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963645] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.iser_use_multipath = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963805] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_bandwidth = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.963986] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.964198] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_downtime = 500 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.964361] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.964528] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.964694] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_inbound_addr = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.964855] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965020] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_permit_post_copy = False {{(pid=67820) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965186] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_scheme = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965358] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_timeout_action = abort {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965518] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_tunnelled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965674] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_uri = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965832] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.live_migration_with_native_tls = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.965996] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.max_queues = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.966189] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.966353] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.nfs_mount_options = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.966674] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.966847] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967024] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_iser_scan_tries = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967187] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_memory_encrypted_guests = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967348] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967508] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_pcie_ports = 0 
{{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967671] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.num_volume_scan_tries = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967832] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.pmem_namespaces = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.967989] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.quobyte_client_cfg = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.968295] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.968467] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rbd_connect_timeout = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.968630] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.968791] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.968951] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rbd_secret_uuid = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969121] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rbd_user = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969287] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969455] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.remote_filesystem_transport = ssh {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969613] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rescue_image_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969770] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rescue_kernel_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.969925] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rescue_ramdisk_id = None {{(pid=67820) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.970103] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.970265] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.rx_queue_size = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.970432] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.smbfs_mount_options = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.970756] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.970877] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.snapshot_compression = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971045] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.snapshot_image_format = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971272] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971439] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.sparse_logical_volumes = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971601] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.swtpm_enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971768] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.swtpm_group = tss {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.971934] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.swtpm_user = tss {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972121] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.sysinfo_serial = unique {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972282] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.tb_cache_size = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972439] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.tx_queue_size = None {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972600] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.uid_maps = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972760] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.use_virtio_for_bridges = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.972927] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.virt_type = kvm {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973105] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.volume_clear = zero {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973272] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.volume_clear_size = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973435] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.volume_use_multipath = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973591] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_cache_path = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973756] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.973941] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_mount_group = qemu {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.974149] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_mount_opts = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.974329] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.974605] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.974779] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.vzstorage_mount_user = stack {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.974944] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
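The block above is the [libvirt] option group being dumped at service start. Every one of these records comes from oslo.config's ConfigOpts.log_opt_values() (the cfg.py:2609 frame cited in each record), which walks all registered option groups and logs one "group.option = value" line per option. A minimal standalone sketch of that mechanism, using two real option names from the dump; registering them by hand here is purely illustrative, nova registers its own full set:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.StrOpt('virt_type', default='kvm'),
         cfg.IntOpt('wait_soft_reboot_seconds', default=120)],
        group='libvirt')
    conf(args=[])  # parse an empty command line; defaults apply
    conf.log_opt_values(LOG, logging.DEBUG)  # emits: libvirt.virt_type = kvm, ...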
[ 518.975129] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.975303] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.auth_type = password {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.975464] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.975625] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.975786] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.975944] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976140] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976324] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.default_floating_pool = public {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976487] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976650] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.extension_sync_interval = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976809] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.http_retries = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.976968] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977150] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977324] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977494] env[67820]:
DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977651] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977814] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.ovs_bridge = br-int {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.977977] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.physnets = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978158] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.region_name = RegionOne {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978330] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.service_metadata_proxy = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978490] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978658] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.service_type = network {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978817] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.978974] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.979168] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.979333] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.979513] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.979674] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] neutron.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
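Note that neutron.metadata_proxy_shared_secret above is rendered as **** rather than its real value: log_opt_values() masks any option that was declared with secret=True, so credentials never reach the log. A minimal sketch of that behaviour, with a made-up placeholder default:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.StrOpt('metadata_proxy_shared_secret', secret=True,
                    default='not-a-real-secret')],  # placeholder value
        group='neutron')
    conf(args=[])
    conf.log_opt_values(LOG, logging.DEBUG)
    # logs: neutron.metadata_proxy_shared_secret = ****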
[ 518.979845] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] notifications.bdms_in_notifications = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980023] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] notifications.default_level = INFO {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980203] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] notifications.notification_format = unversioned {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980370] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] notifications.notify_on_state_change = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980545] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980722] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] pci.alias = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.980891] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] pci.device_spec = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981065] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] pci.report_in_placement = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981243] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981414] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.auth_type = password {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981579] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981736] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.981890] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982060] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982222] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None]
placement.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982435] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982624] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.default_domain_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982784] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.default_domain_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.982941] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.domain_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983110] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.domain_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983273] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983435] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983592] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983745] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.983903] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984128] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.password = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984322] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.project_domain_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984495] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.project_domain_name = Default {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984661] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.project_id = None {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984833] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.project_name = service {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.984998] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.region_name = RegionOne {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985174] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.service_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985340] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.service_type = placement {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985500] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985658] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985815] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.985975] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.system_scope = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986168] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986335] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.trust_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986492] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.user_domain_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986657] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.user_domain_name = Default {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986816] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.user_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.986989] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.username = placement {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
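The [placement] options logged above (auth_type = password, auth_url, username, project_name, the masked password, and the domain settings) form a standard keystoneauth1 credential block. A hedged sketch of the usual keystoneauth1 idiom for turning such a block into an authenticated session -- not nova's exact code path, and the config-file location is an assumption:

    from keystoneauth1 import loading
    from oslo_config import cfg

    conf = cfg.ConfigOpts()
    # register the standard auth + session options under the [placement] group
    loading.register_auth_conf_options(conf, 'placement')
    loading.register_session_conf_options(conf, 'placement')
    conf(args=[], default_config_files=['/etc/nova/nova.conf'])  # assumed path

    auth = loading.load_auth_from_conf_options(conf, 'placement')
    sess = loading.load_session_from_conf_options(conf, 'placement', auth=auth)
    # e.g. sess.get('/resource_providers',
    #               endpoint_filter={'service_type': 'placement'})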
[ 518.987181] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.987345] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] placement.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.987521] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.cores = 20 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.987684] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.count_usage_from_placement = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.987851] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988022] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.injected_file_content_bytes = 10240 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988194] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.injected_file_path_length = 255 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988362] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.injected_files = 5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988531] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.instances = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988697] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.key_pairs = 100 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.988862] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.metadata_items = 128 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.989039] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.ram = 51200 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.989209] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.recheck_quota = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.989375] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] quota.server_group_members = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.989539] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None
None] quota.server_groups = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.989703] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rdp.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990028] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990221] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990388] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990553] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.image_metadata_prefilter = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990715] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.990879] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.max_attempts = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991050] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.max_placement_results = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991219] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991378] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.query_placement_for_image_type_support = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991538] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991709] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] scheduler.workers = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.991878] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
518.992056] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.992241] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.992411] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.992576] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.992736] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.992898] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993097] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993269] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.host_subset_size = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993433] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993590] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993750] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.993929] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.isolated_hosts = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994128] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.isolated_images = [] 
{{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994300] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994462] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994629] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994785] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.pci_in_placement = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.994947] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995120] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995289] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995452] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995614] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995776] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.995936] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.track_instance_changes = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.996146] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
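
The filter_scheduler block above closes the scheduler option group. Like every other entry in this dump, it is emitted by oslo.config's ConfigOpts.log_opt_values, the cfg.py:2609 frame that terminates each line. A minimal sketch of that mechanism, assuming a trimmed-down option list (Nova's real registration covers far more options, and the logger name here is only illustrative):

```python
# Minimal sketch of how this dump is produced with oslo.config.
# Group and option names mirror the log above; the reduced option list
# and logger name are assumptions for illustration, not Nova's code.
import logging

from oslo_config import cfg

LOG = logging.getLogger('oslo_service.service')

filter_scheduler_opts = [
    cfg.IntOpt('host_subset_size', default=1),
    cfg.IntOpt('max_instances_per_host', default=50),
    cfg.FloatOpt('ram_weight_multiplier', default=1.0),
    cfg.ListOpt('enabled_filters',
                default=['ComputeFilter', 'ComputeCapabilitiesFilter',
                         'ImagePropertiesFilter']),
]

CONF = cfg.CONF
CONF.register_opts(filter_scheduler_opts, group='filter_scheduler')

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG)
    CONF([], project='nova')  # would normally pick up /etc/nova/nova.conf
    # Emits one DEBUG "group.option = value" line per registered option,
    # which is exactly what fills this section of the log.
    CONF.log_opt_values(LOG, logging.DEBUG)
```
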
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.996493] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metrics.weight_multiplier = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.996656] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.996818] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] metrics.weight_setting = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997124] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997304] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997480] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.port_range = 10000:20000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997650] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997817] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.997984] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] serial_console.serialproxy_port = 6083 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998168] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998345] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.auth_type = password {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998504] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998662] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998822] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.collect_timing = False {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.998981] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999151] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999338] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.send_service_user_token = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999500] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999656] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] service_user.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999827] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.agent_enabled = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 518.999988] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.000296] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.000489] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.000660] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.html5proxy_port = 6082 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.000822] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.image_compression = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.000981] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.jpeg_compression = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001159] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.playback_compression = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001332] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.server_listen = 127.0.0.1 {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001505] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001665] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.streaming_mode = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001826] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] spice.zlib_compression = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.001994] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] upgrade_levels.baseapi = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002173] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] upgrade_levels.cert = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002343] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] upgrade_levels.compute = auto {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002502] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] upgrade_levels.conductor = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002660] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] upgrade_levels.scheduler = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002826] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.002988] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.auth_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003161] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003321] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003482] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003639] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.insecure = False {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003794] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.003970] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004153] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vendordata_dynamic_auth.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004333] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.api_retry_count = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004494] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.ca_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004703] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.cache_prefix = devstack-image-cache {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004832] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.cluster_name = testcl1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.004993] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.connection_pool_size = 10 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.005169] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.console_delay_seconds = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.005338] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.datastore_regex = ^datastore.* {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.005542] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.005714] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.host_password = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.005877] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.host_port = 443 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006072] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.host_username = administrator@vsphere.local {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006259] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.insecure = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006421] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.integration_bridge = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006582] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.maximum_objects = 100 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006738] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.pbm_default_policy = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.006896] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.pbm_enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007057] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.pbm_wsdl_location = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007229] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007388] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.serial_port_proxy_uri = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007544] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.serial_port_service_uri = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007707] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.task_poll_interval = 0.5 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.007875] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.use_linked_clone = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.008051] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.vnc_keymap = en-us {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.008222] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.vnc_port = 5900 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.008408] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vmware.vnc_port_total = 10000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
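
That completes the [vmware] group that the VMware driver consumes: vCenter at vc1.osci.c.eu-de-1.cloud.sap on port 443 with insecure = True, cluster testcl1, a ^datastore.* datastore regex and a devstack-image-cache prefix. Note that host_password prints as ****: oslo.config masks any option registered with secret=True. A sketch of the registration and a typical read, assuming an abbreviated option list (the helper at the bottom is hypothetical):

```python
# Sketch of how the [vmware] group above is registered and read.
# Only a subset of the options from the log is shown; defaults mirror
# the dumped values, everything else is illustrative.
from oslo_config import cfg

vmware_opts = [
    cfg.StrOpt('host_ip'),
    cfg.PortOpt('host_port', default=443),
    cfg.StrOpt('host_username'),
    cfg.StrOpt('host_password', secret=True),  # secret=True => logged as ****
    cfg.IntOpt('api_retry_count', default=10),
    cfg.FloatOpt('task_poll_interval', default=0.5),
    cfg.StrOpt('datastore_regex'),
    cfg.BoolOpt('use_linked_clone', default=False),
]

CONF = cfg.CONF
CONF.register_opts(vmware_opts, group='vmware')

def vcenter_endpoint():
    # For the values dumped above this would evaluate to
    # ('vc1.osci.c.eu-de-1.cloud.sap', 443).
    return CONF.vmware.host_ip, CONF.vmware.host_port
```
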
[ 519.008607] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.auth_schemes = ['none'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.008782] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009081] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009271] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009441] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.novncproxy_port = 6080 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009617] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.server_listen = 127.0.0.1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009785] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.009945] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.vencrypt_ca_certs = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010117] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.vencrypt_client_cert = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010277] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vnc.vencrypt_client_key = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010450] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010615] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_deep_image_inspection = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010775] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.010934] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [
519.011104] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.011273] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.disable_rootwrap = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.011431] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.enable_numa_live_migration = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.011588] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.011745] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.011903] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012074] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.libvirt_disable_apic = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012270] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012449] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012612] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012792] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.012988] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.013173] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.013339] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.013525] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.013692] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.013857] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.014079] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.014325] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.client_socket_timeout = 900 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.014616] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.default_pool_size = 1000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.014938] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.keep_alive = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.015188] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.max_header_line = 16384 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.015378] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.secure_proxy_ssl_header = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.015583] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.ssl_ca_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.015789] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.ssl_cert_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.015987] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.ssl_key_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.016194] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] wsgi.tcp_keepidle = 600 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.016379] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.016549] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] zvm.ca_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.016710] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] zvm.cloud_connector_url = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.016998] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.017191] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] zvm.reachable_timeout = 300 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.017375] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.enforce_new_defaults = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.017551] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.enforce_scope = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.017722] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.policy_default_rule = default {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.017902] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018087] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.policy_file = policy.yaml {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018264] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018447] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018613] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018772] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67820) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.018933] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.019116] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.019298] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
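
The oslo_policy section that just finished shows secure RBAC fully enabled in this run: enforce_new_defaults = True and enforce_scope = True, with rules loaded from policy.yaml plus any overrides under policy.d. In application code those settings surface through oslo.policy's Enforcer; a hedged sketch, where the rule name, check string and credentials are made up for illustration:

```python
# Sketch of how the [oslo_policy] settings above are consumed.
# Enforcer() picks policy_file/policy_dirs up from CONF; the rule name,
# default check string and credentials below are illustrative only.
from oslo_config import cfg
from oslo_policy import policy

CONF = cfg.CONF
enforcer = policy.Enforcer(CONF)  # -> policy.yaml, policy.d/, default rule

enforcer.register_default(
    policy.RuleDefault('os_compute_api:servers:show',
                       'rule:project_reader_api'))

def allowed(creds, target):
    # With enforce_scope = True a token with the wrong scope is rejected
    # even when the role matches; do_raise=False returns a bool instead
    # of raising PolicyNotAuthorized.
    return enforcer.enforce('os_compute_api:servers:show',
                            target, creds, do_raise=False)
```
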
[ 519.019475] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.connection_string = messaging:// {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.019642] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.enabled = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.019813] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.es_doc_type = notification {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.019978] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.es_scroll_size = 10000 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.020158] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.es_scroll_time = 2m {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.020325] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.filter_error_trace = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.020525] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.hmac_keys = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.020699] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.sentinel_service_name = mymaster {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.020867] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.socket_timeout = 0.1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021037] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.trace_requests = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021203] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler.trace_sqlalchemy = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021381] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler_jaeger.process_tags = {} {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021542] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler_jaeger.service_name_prefix = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021704] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] profiler_otlp.service_name_prefix = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.021867] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] remote_debug.host = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022037] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] remote_debug.port = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022224] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022388] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022551] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022715] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.022876] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023045] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023214] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023375] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023535] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None]
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023696] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.023857] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024033] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024209] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024374] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024535] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024716] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.024883] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025054] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025218] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025382] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025542] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025724] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.025965] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.026179] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.026356] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.026526] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.026699] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.026870] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027043] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027223] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027393] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_rabbit.ssl_version = {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027580] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027747] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_notifications.retry = -1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.027928] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.028114] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_messaging_notifications.transport_url = **** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
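
That closes the messaging configuration: oslo_messaging_rabbit tunes the RabbitMQ transport (a 60 s heartbeat timeout threshold, quorum queues off, TLS off), and oslo_messaging_notifications selects the messagingv2 driver on the notifications topic with retry = -1, i.e. retry forever; the transport_url itself is masked as ****. A sketch of a notifier wired up with exactly those three notification settings, where the publisher_id, event type and payload are illustrative:

```python
# Sketch: an oslo.messaging notifier using the values dumped above
# (driver=messagingv2, topics=['notifications'], retry=-1). The
# publisher_id, event type and payload are made up for illustration.
import oslo_messaging
from oslo_config import cfg

CONF = cfg.CONF

transport = oslo_messaging.get_notification_transport(CONF)
notifier = oslo_messaging.Notifier(
    transport,
    publisher_id='compute.devstack1',  # hypothetical
    driver='messagingv2',
    topics=['notifications'],
    retry=-1,  # -1 = keep retrying until the message is delivered
)

notifier.info({}, 'compute.instance.create.end',
              {'instance_uuid': '00000000-0000-0000-0000-000000000000'})
```
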
[ 519.028291] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.auth_section = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.028478] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.auth_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.028641] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.cafile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.028812] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.certfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.028977] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.collect_timing = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029151] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.connect_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029314] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.connect_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029470] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.endpoint_id = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029626] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.endpoint_override = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029783] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.insecure = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.029936] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.keyfile = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030104] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.max_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030266] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.min_version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030419] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.region_name = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030573] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.service_name = None {{(pid=67820)
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030726] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.service_type = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.030880] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.split_loggers = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031044] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.status_code_retries = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031209] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.status_code_retry_delay = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031369] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.timeout = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031518] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.valid_interfaces = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031669] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_limit.version = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031828] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_reports.file_event_handler = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.031990] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.032162] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] oslo_reports.log_dir = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.032332] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.032531] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.032704] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.032872] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033045] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033210] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033379] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033538] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033695] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.033857] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.034026] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.034185] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] vif_plug_ovs_privileged.user = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
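
The *_privileged groups just dumped configure the oslo.privsep daemons behind the os-vif plugins, and their capability lists are plain numeric Linux capabilities: [12] for linux_bridge is CAP_NET_ADMIN, [12, 1] for ovs is CAP_NET_ADMIN plus CAP_DAC_OVERRIDE, and the [21] in privsep_osbrick further down is CAP_SYS_ADMIN. A sketch of the kind of PrivContext such a section configures; the context name and the decorated helper are illustrative, only the cfg_section and capabilities mirror the log:

```python
# Sketch of an oslo.privsep context tied to a config section such as
# [vif_plug_ovs_privileged]. The module layout and the helper function
# are assumptions; the capabilities match the dump above.
from oslo_privsep import capabilities as caps
from oslo_privsep import priv_context

# CAP_NET_ADMIN = 12, CAP_DAC_OVERRIDE = 1 -> "capabilities = [12, 1]"
vif_plug = priv_context.PrivContext(
    'vif_plug_ovs',
    cfg_section='vif_plug_ovs_privileged',
    pctxt_args={},
    capabilities=[caps.CAP_NET_ADMIN, caps.CAP_DAC_OVERRIDE],
)

@vif_plug.entrypoint
def set_device_mtu(dev_name, mtu):
    # Runs inside the privileged daemon; a real plugin would configure
    # the device here (e.g. via pyroute2 or a shell helper).
    pass
```
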
env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.035385] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.035543] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.035722] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.035890] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.isolate_vif = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036085] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036262] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036434] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036601] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.ovsdb_interface = native {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036761] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_vif_ovs.per_port_bridge = False {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.036926] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_brick.lock_path = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037104] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037272] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037441] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.capabilities = [21] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037599] 
env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037753] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.helper_command = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.037916] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038086] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038246] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] privsep_osbrick.user = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038440] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038604] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.group = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038759] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.helper_command = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.038922] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.039093] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.039256] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] nova_sys_admin.user = None {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 519.039384] env[67820]: DEBUG oslo_service.service [None req-0081236e-3c1c-4a6d-917e-396f534b380b None None] ******************************************************************************** {{(pid=67820) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 519.039805] env[67820]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 519.050680] env[67820]: WARNING nova.virt.vmwareapi.driver [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 519.051154] env[67820]: INFO nova.virt.node [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Generated node identity 0f792661-ec04-4fc2-898f-e9860339eddd [ 519.051383] env[67820]: INFO nova.virt.node [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Wrote node identity 0f792661-ec04-4fc2-898f-e9860339eddd to /opt/stack/data/n-cpu-1/compute_id [ 519.064167] env[67820]: WARNING nova.compute.manager [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Compute nodes ['0f792661-ec04-4fc2-898f-e9860339eddd'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 519.099188] env[67820]: INFO nova.compute.manager [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 519.122105] env[67820]: WARNING nova.compute.manager [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 519.122351] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 519.122569] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 519.122714] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 519.122871] env[67820]: DEBUG nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 519.124073] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09eca5a4-17e5-43ee-8dce-b606af034e1b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.132598] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a63af0d3-d423-4009-9282-ce3155d1b010 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.146265] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a468202d-38b3-492a-8b4d-d8f6eb78c1fa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.152231] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcf795bd-fc59-412d-8462-f45f0388543c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.181648] env[67820]: DEBUG nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 519.181770] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 519.181938] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 519.193746] env[67820]: WARNING nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] No compute node record for cpu-1:0f792661-ec04-4fc2-898f-e9860339eddd: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0f792661-ec04-4fc2-898f-e9860339eddd could not be found. [ 519.207167] env[67820]: INFO nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 0f792661-ec04-4fc2-898f-e9860339eddd [ 519.259399] env[67820]: DEBUG nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 519.259599] env[67820]: DEBUG nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 519.363384] env[67820]: INFO nova.scheduler.client.report [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] [req-c2384116-c91b-45a8-9baa-4d6b3b0c5646] Created resource provider record via placement API for resource provider with UUID 0f792661-ec04-4fc2-898f-e9860339eddd and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
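
The passage above is the compute service's first startup audit. The resource tracker takes the "compute_resources" lock, reads the cluster capacity that vCenter reports (48 vCPUs, 196590 MB of RAM, 94 GB of free disk in the hypervisor view), finds no existing compute node record for host cpu-1, creates one under UUID 0f792661-ec04-4fc2-898f-e9860339eddd, and registers a matching resource provider in the placement API. The inventory it reports in the next entries applies per-resource-class allocation ratios and reservations on top of the raw totals. The sketch below reconstructs that payload shape in Python; build_inventory is a hypothetical helper for illustration only, not Nova's actual ResourceTracker code, and the constants are simply the values visible in this log.

def build_inventory(total_vcpus, ram_mb, disk_gb,
                    cpu_ratio=4.0, ram_ratio=1.0, disk_ratio=1.0,
                    reserved_ram_mb=512, max_unit_vcpu=16,
                    max_unit_disk_gb=94):
    """Return a placement-style inventory dict for one resource provider."""
    return {
        'VCPU': {'total': total_vcpus, 'reserved': 0, 'min_unit': 1,
                 'max_unit': max_unit_vcpu, 'step_size': 1,
                 'allocation_ratio': cpu_ratio},
        'MEMORY_MB': {'total': ram_mb, 'reserved': reserved_ram_mb,
                      'min_unit': 1, 'max_unit': 65530, 'step_size': 1,
                      'allocation_ratio': ram_ratio},
        'DISK_GB': {'total': disk_gb, 'reserved': 0, 'min_unit': 1,
                    'max_unit': max_unit_disk_gb, 'step_size': 1,
                    'allocation_ratio': disk_ratio},
    }

# Values visible in the log: 48 vCPUs, 196590 MB RAM, 400 GB of datastore.
print(build_inventory(48, 196590, 400))
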
[ 519.380968] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30106180-1a1c-4fd0-afe1-a429d81c9895 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.388701] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb42ead-371a-40a7-bc48-3e66f7b99fe0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.418663] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33a74269-03ce-4431-9e6d-f18de4c3fda2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.425771] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65391543-d8c5-47a9-bcbb-0ee8a2fd26e5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 519.438887] env[67820]: DEBUG nova.compute.provider_tree [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 519.479379] env[67820]: DEBUG nova.scheduler.client.report [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Updated inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 519.479642] env[67820]: DEBUG nova.compute.provider_tree [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Updating resource provider 0f792661-ec04-4fc2-898f-e9860339eddd generation from 0 to 1 during operation: update_inventory {{(pid=67820) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 519.479809] env[67820]: DEBUG nova.compute.provider_tree [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 519.526061] env[67820]: DEBUG nova.compute.provider_tree [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Updating resource 
provider 0f792661-ec04-4fc2-898f-e9860339eddd generation from 1 to 2 during operation: update_traits {{(pid=67820) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 519.542575] env[67820]: DEBUG nova.compute.resource_tracker [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 519.542776] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.361s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 519.542940] env[67820]: DEBUG nova.service [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Creating RPC server for service compute {{(pid=67820) start /opt/stack/nova/nova/service.py:182}} [ 519.557628] env[67820]: DEBUG nova.service [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] Join ServiceGroup membership for this service compute {{(pid=67820) start /opt/stack/nova/nova/service.py:199}} [ 519.557811] env[67820]: DEBUG nova.servicegroup.drivers.db [None req-1bc41e35-85c8-428c-8317-e3033959faef None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67820) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 528.879691] env[67820]: DEBUG dbcounter [-] [67820] Writing DB stats nova_cell1:SELECT=1 {{(pid=67820) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 528.881063] env[67820]: DEBUG dbcounter [-] [67820] Writing DB stats nova_cell0:SELECT=1 {{(pid=67820) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 539.560459] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 539.571647] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Getting list of instances from cluster (obj){ [ 539.571647] env[67820]: value = "domain-c8" [ 539.571647] env[67820]: _type = "ClusterComputeResource" [ 539.571647] env[67820]: } {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 539.572983] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93c5fd87-48e0-4cd5-9cc1-2c8fcc9328db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 539.582168] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Got total of 0 instances {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 539.582393] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 539.582705] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Getting list of instances from cluster (obj){ [ 539.582705] 
env[67820]: value = "domain-c8" [ 539.582705] env[67820]: _type = "ClusterComputeResource" [ 539.582705] env[67820]: } {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 539.583552] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c2ca9bb-6a19-4310-8d92-70445d0b2ddd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 539.590623] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Got total of 0 instances {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 562.314388] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "d08e77ae-af85-4dfe-86e7-60f850369485" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.314679] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d08e77ae-af85-4dfe-86e7-60f850369485" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.350281] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 562.503142] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.503469] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.505253] env[67820]: INFO nova.compute.claims [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 562.679649] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-055a7702-391e-45dc-8513-1cbe1dcc922a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.695257] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfcf4303-e879-4290-ae4d-13e4264174b2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.736373] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4fbb798-c120-4f61-bf4d-ce22e28fa7b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.744897] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464c271a-09be-422f-8928-415e1bce94ea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.760710] env[67820]: DEBUG nova.compute.provider_tree [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 562.779078] env[67820]: DEBUG nova.scheduler.client.report [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 562.811478] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 562.812104] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 562.860989] env[67820]: DEBUG nova.compute.utils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 562.862923] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 562.863191] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 562.887062] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 562.984607] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.984773] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.987328] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 563.006219] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 563.087103] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 563.087402] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 563.089639] env[67820]: INFO nova.compute.claims [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 563.239854] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58108497-052d-43b1-b10d-14b3cd8fbff5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.254902] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65b75de5-e493-473c-94fe-b169aebd2a4f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.297982] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c3c3d94-c683-4350-864b-8a3152e30b08 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.307397] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3385361d-0705-4f5c-804c-3b973ae01831 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.324042] env[67820]: DEBUG nova.compute.provider_tree [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 563.348109] env[67820]: DEBUG nova.scheduler.client.report [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 563.369896] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 563.370600] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 563.417194] env[67820]: DEBUG nova.compute.utils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 563.419488] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 563.419563] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 563.434506] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 563.434838] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 563.434932] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 
tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 563.435156] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 563.435266] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 563.435415] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 563.435629] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 563.435786] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 563.436169] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 563.436345] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 563.436518] env[67820]: DEBUG nova.virt.hardware [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 563.438108] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6afc8c9c-1943-4ba9-84ea-44e8c4ab8722 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.445597] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Start building block device mappings for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 563.456277] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1424f9f-ae8b-482c-9460-b3461f993e64 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.476448] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1577cb8-21db-4ab5-a1da-fddecf05fce5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.540305] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 563.577690] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 563.577925] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 563.578107] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 563.578756] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 563.578756] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 563.578756] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 563.578894] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 563.578927] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 563.579101] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 563.579269] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 563.582202] env[67820]: DEBUG nova.virt.hardware [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 563.582202] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d45f0eaa-09b4-488e-b8aa-527c40830c43 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.592803] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb823121-a0b2-4f13-a621-e4ca56ec4eba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.678693] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "11335aca-6576-4d27-b50b-95ecc0c157a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 563.678931] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "11335aca-6576-4d27-b50b-95ecc0c157a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 563.697168] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Starting 
instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 563.781399] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 563.781727] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 563.783524] env[67820]: INFO nova.compute.claims [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 563.850189] env[67820]: DEBUG nova.policy [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 563.877508] env[67820]: DEBUG nova.policy [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '20c0e91a2a46450980bef25b5a373f6a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b141581737a44c5894416bcaa7af709', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 563.942306] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a4fbe74-cb46-4468-8c37-273fdbfa5438 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.952260] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaf71329-f21c-486e-87da-15f0586ebaf5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.992823] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27e727d6-69dd-4352-bf0a-abf34659a1c4 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.000884] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3824fe1-2e58-432b-8d5d-8aa81a43c9df {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.016418] env[67820]: DEBUG nova.compute.provider_tree [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 564.027277] env[67820]: DEBUG nova.scheduler.client.report [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 564.052085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 564.052713] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 564.109687] env[67820]: DEBUG nova.compute.utils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 564.114870] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Not allocating networking since 'none' was specified. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 564.134112] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Start building block device mappings for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 564.258248] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 564.291656] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 564.291656] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 564.291656] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 564.291854] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 564.291854] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 564.291854] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 564.291985] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 564.295452] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 564.295452] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 564.295452] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 564.295452] env[67820]: DEBUG nova.virt.hardware [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 564.295452] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76672891-589e-42f5-b68d-b5ad08ef22a2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.303131] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35bb8845-9b4f-4bc9-baf4-c068c0364b03 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.317932] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Instance VIF info [] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 564.328441] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.328758] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-18f26b5c-71a5-4184-a53e-5231d2881a35 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.340330] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Created folder: OpenStack in parent group-v4. [ 564.340494] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating folder: Project (e74f660159074c58a05659ef95889177). Parent ref: group-v692668. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.340729] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4a2af7b0-c9eb-4d29-b80a-19999f136197 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.352877] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Created folder: Project (e74f660159074c58a05659ef95889177) in parent group-v692668. [ 564.352877] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating folder: Instances. Parent ref: group-v692669. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.352877] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-668971b6-c5ce-4863-8a21-5f3e5cf0417a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.362058] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Created folder: Instances in parent group-v692669. [ 564.364176] env[67820]: DEBUG oslo.service.loopingcall [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 564.364176] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 564.364176] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b6c790e4-c12f-4484-b824-11a8c57faa53 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.380691] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 564.380691] env[67820]: value = "task-3467271" [ 564.380691] env[67820]: _type = "Task" [ 564.380691] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 564.391723] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467271, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 564.819363] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Successfully created port: 5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 564.896143] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467271, 'name': CreateVM_Task, 'duration_secs': 0.357073} completed successfully. 
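The CreateVM_Task above goes from "progress is 0%" to "completed successfully" in about 0.36 s; oslo.vmware's wait_for_task drives that by repeatedly reading the task state. A stripped-down sketch of the same polling loop, with a caller-supplied get_task_info standing in for the PropertyCollector reads the library actually performs:

    import time

    def wait_for_task(get_task_info, task_ref, interval=0.5, timeout=300):
        """Poll a vSphere-style task until it leaves the running states.

        get_task_info is caller-supplied and returns an object with .state
        ('queued'/'running'/'success'/'error'), mimicking vSphere TaskInfo.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info(task_ref)
            if info.state == 'success':
                return info
            if info.state == 'error':
                raise RuntimeError(f"task {task_ref} failed")
            time.sleep(interval)  # the log shows ~0.36 s for CreateVM_Task
        raise TimeoutError(f"task {task_ref} did not finish in {timeout}s")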
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 564.896424] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 564.898291] env[67820]: DEBUG oslo_vmware.service [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e32a75e5-f358-44c1-b46f-c7be2fb15f87 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.913754] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 564.914126] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 564.916038] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 564.916038] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9dcb21d-3e8c-4d05-8aa0-9598b6ab630f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 564.923324] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for the task: (returnval){ [ 564.923324] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]527f6ac0-07c1-bd64-ef0b-67f65c3789b9" [ 564.923324] env[67820]: _type = "Task" [ 564.923324] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 564.936817] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]527f6ac0-07c1-bd64-ef0b-67f65c3789b9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 565.066828] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Successfully created port: 75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 565.439761] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 565.443051] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 565.443051] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 565.443051] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 565.443051] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 565.443275] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-825ce37b-6c14-41ab-9496-4acde7ade28b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.467638] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 565.467638] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Folder [datastore1] devstack-image-cache_base created. 
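Above, the build serializes on the datastore image-cache path ("[datastore1] devstack-image-cache_base/4407539e-...") before deciding whether the VMDK must be fetched. The shape of that fetch-if-missing pattern, sketched with oslo.concurrency's named locks and a caller-supplied fetch callable, with the local filesystem standing in for the datastore:

    import os
    from oslo_concurrency import lockutils

    def ensure_cached(cache_dir, image_id, fetch):
        """Fetch-if-missing under a named lock, mirroring the pattern in the
        log: serialize on the cache path so concurrent builds of the same
        image do not download it twice. `fetch` writes the image to a path."""
        target = os.path.join(cache_dir, image_id)
        with lockutils.lock(f"image-cache-{image_id}"):
            if not os.path.exists(target):
                fetch(target)
        return target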
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 565.468690] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-880f0f0c-c57e-4540-9ba0-537b6c6d468c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.476426] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-216bf200-0f3f-4b34-b1d5-80015e6545b8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.482070] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for the task: (returnval){ [ 565.482070] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52dbb998-ee21-1bf1-7d09-97c48bd206d4" [ 565.482070] env[67820]: _type = "Task" [ 565.482070] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 565.495932] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52dbb998-ee21-1bf1-7d09-97c48bd206d4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 565.528050] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.528508] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.542269] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 565.636735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.637222] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.638840] env[67820]: INFO nova.compute.claims [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 565.864604] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18233ca4-b73b-401b-9a0e-0c820408f916 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.874261] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39c7c92c-e582-442c-a683-be15663f9604 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.923435] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93714dd6-2f88-4017-8710-68058d663f30 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.932295] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04770a56-4cb6-43ca-a212-8330f9b6477c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.944345] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "14478951-d2c1-4472-af0c-354757e0bb0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.944605] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "14478951-d2c1-4472-af0c-354757e0bb0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.959392] env[67820]: DEBUG nova.compute.provider_tree [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] 
Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 565.966920] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 565.988557] env[67820]: DEBUG nova.scheduler.client.report [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.001499] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 566.002241] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating directory with path [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 566.002873] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3561c7a2-27ae-4965-a734-a4d57d4f3618 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.027008] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 566.027008] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Start building networks asynchronously for instance. 
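The lockutils entries throughout record wait and hold durations ("waited 0.001s", "held 0.387s"), which is what makes contention on "compute_resources" visible in these logs. A small decorator reproducing that accounting around any critical section (a sketch in the spirit of oslo.concurrency's messages, not its implementation):

    import threading
    import time

    _lock = threading.Lock()

    def timed_lock(fn):
        """Log wait and hold durations, like oslo's 'waited X / held Y'."""
        def wrapper(*args, **kwargs):
            t0 = time.monotonic()
            with _lock:
                waited = time.monotonic() - t0
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f"{fn.__name__}: waited {waited:.3f}s, held {held:.3f}s")
        return wrapper

    @timed_lock
    def claim_resources():
        time.sleep(0.01)  # stand-in for the inventory bookkeeping above

    claim_resources()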
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 566.032284] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Created directory with path [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 566.032584] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Fetch image to [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 566.032698] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 566.033592] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e959f368-f269-41b0-b86a-3b9e9477b112 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.047461] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bb5e5fb-9f85-407c-b07c-45e7a675b6f6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.066756] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22a96ef9-5007-4460-81d2-7a7af4779f80 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.075130] env[67820]: DEBUG nova.compute.utils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 566.104937] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Allocating IP information in the background. 
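"Allocating IP information in the background" means the Neutron round-trips run concurrently with the rest of the build; Nova does this with eventlet greenthreads. The same overlap expressed with the stdlib instead (the function body and return value here are illustrative stand-ins, not Neutron calls):

    from concurrent.futures import ThreadPoolExecutor

    def allocate_network(instance_id):
        # Stand-in for neutron allocate_for_instance(); returns fake VIF info.
        return [{"port_id": "example", "instance": instance_id}]

    with ThreadPoolExecutor(max_workers=1) as pool:
        # Start allocation, keep building block devices meanwhile...
        future = pool.submit(allocate_network,
                             "0421f92d-8c91-4a60-beb9-f1a799e6d1b4")
        # ...and join only when the VIFs are actually needed to plug the VM.
        network_info = future.result(timeout=120)
    print(network_info)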
{{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 566.105304] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 566.110583] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 566.114387] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.114669] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.118356] env[67820]: INFO nova.compute.claims [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 566.122866] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-253633c4-32a1-4939-89e8-df6899231fba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.141133] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f75802a4-ecbe-401a-8d59-bd14f2c2a693 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.182375] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 566.232319] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 566.258755] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 566.259013] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 566.259682] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 566.259682] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 566.259682] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 566.259682] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 566.259917] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 566.259997] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 566.260310] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 566.260736] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 566.260736] env[67820]: DEBUG nova.virt.hardware [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 566.261777] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39ff38a9-5020-4f42-b9a1-e6175c500aae {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.268743] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 566.336612] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79bd2edd-7e03-4639-b197-b00ba043c990 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.342771] env[67820]: DEBUG nova.policy [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25977dff844b4d73aab55d1e61f4ef1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f77ce380931343b591d73e4966a830d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 566.345304] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Completed reading data from the image iterator. 
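The rw_handles entries show the image copied to the datastore with a plain HTTPS file PUT against the ESX host's /folder endpoint, sized up front (21318656 bytes here). A bare-bones version of that write connection, omitting the session cookie and TLS-verification details the real handle carries:

    import http.client
    from urllib.parse import urlsplit

    def upload_stream(url, data_iter, size):
        """Stream image bytes to a datastore folder URL with one PUT.
        The caller guarantees data_iter yields exactly `size` bytes."""
        parts = urlsplit(url)
        conn = http.client.HTTPSConnection(parts.hostname, parts.port or 443)
        target = f"{parts.path}?{parts.query}" if parts.query else parts.path
        conn.putrequest('PUT', target)
        conn.putheader('Content-Length', str(size))
        conn.endheaders()
        for chunk in data_iter:
            conn.send(chunk)
        resp = conn.getresponse()
        resp.read()
        conn.close()
        return resp.status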
{{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 566.345304] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 566.371946] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0ae274-11f4-4083-adf7-16ab7f6e911e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.380009] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e9b5e6a-ab34-4faa-a530-2f1fabcf9c90 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.412553] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44769e75-ee95-42de-b753-b437185a83c7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.420386] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84cc080b-64a1-4e46-84ac-ba0173c215f5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.434510] env[67820]: DEBUG nova.compute.provider_tree [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 566.445957] env[67820]: DEBUG nova.scheduler.client.report [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.463626] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 566.464120] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] 
[instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 566.503676] env[67820]: DEBUG nova.compute.utils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 566.505845] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 566.505845] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 566.515903] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 566.606292] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Start spawning the instance on the hypervisor. 
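"Using /dev/sd instead of None", repeated for each build above, is the fallback in get_next_device_name when a block-device mapping arrives without a device prefix. A simplified picker for the next free name under that prefix (the real code also handles multi-letter suffixes and already-reserved names):

    import string

    def next_device_name(existing, prefix="/dev/sd"):
        """Pick the next free disk device name, e.g. /dev/sda -> /dev/sdb."""
        used = {name[len(prefix):] for name in existing
                if name.startswith(prefix)}
        for letter in string.ascii_lowercase:
            if letter not in used:
                return prefix + letter
        raise ValueError("no free device names")

    print(next_device_name(["/dev/sda"]))  # /dev/sdb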
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 566.639985] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 566.640281] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 566.641175] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 566.641463] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 566.641625] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 566.641776] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 566.641991] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 566.642165] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 566.642332] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 566.642492] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 566.642658] env[67820]: DEBUG nova.virt.hardware [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 566.643973] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0466aca-7c74-4e5d-8d32-e7c0974a9e45 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.653461] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b2836bd-395f-4681-8948-c548bb00ada1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.719013] env[67820]: DEBUG nova.policy [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf775f4f86b145d895e43e99ff84714d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fddbbfb428ea4faba97457968105ad2e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 566.839808] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "be35d888-f649-44e4-af23-341b8bfc81f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.840212] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.853225] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 
tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 566.915871] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.917820] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.919232] env[67820]: INFO nova.compute.claims [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 567.162663] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d7542e-822d-472d-9715-f89ce358e712 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.178550] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34cbc869-e31b-4b3a-b805-37ca7a5d7a53 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.217183] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-375c0cd8-40cd-4f97-8474-4b144eb8c41f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.226834] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d469d8-f0d2-4678-a326-d8ffacf5544a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.243906] env[67820]: DEBUG nova.compute.provider_tree [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 567.255632] env[67820]: DEBUG nova.scheduler.client.report [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 567.286728] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 567.287312] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 567.357880] env[67820]: DEBUG nova.compute.utils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 567.359989] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 567.360139] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 567.379409] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 567.479766] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Start spawning the instance on the hypervisor. 
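Each build repeats the same nova.virt.hardware sequence seen above: with no flavor or image constraints (limits 0:0:0), the code enumerates every (sockets, cores, threads) triple whose product equals the vCPU count, which for one vCPU leaves only 1:1:1. A condensed sketch of that enumeration (Nova's version additionally sorts by flavor and image preferences):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) triples whose product is vcpus.
        Brute force is fine for a sketch; 65536 matches the log's limits."""
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        yield (sockets, cores, threads)

    print(list(possible_topologies(1)))  # [(1, 1, 1)]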
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 567.507301] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Successfully updated port: 5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 567.524115] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 567.526069] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 567.526332] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 567.526494] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 567.526635] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 567.526802] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 567.527030] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 567.527186] env[67820]: DEBUG 
nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 567.527345] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 567.527498] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 567.527668] env[67820]: DEBUG nova.virt.hardware [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 567.528598] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7430ff5-7779-451a-8fbd-8093a8339ce2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.532242] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.532372] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 567.532515] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 567.544414] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4f0327d-b8e6-4cea-8142-c9d29868b7de {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.617438] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Successfully updated port: 75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 567.629166] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock 
"refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.629291] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 567.629437] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 567.870245] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 567.871520] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 567.891739] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 567.984869] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Successfully created port: b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 567.996479] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 567.996479] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 567.998095] env[67820]: INFO nova.compute.claims [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 568.051543] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 568.077365] env[67820]: DEBUG nova.policy [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b462f02f5204efeaf3a50fec81f810d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2311daf1600d487ab20c38a98ae2c892', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 568.271648] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f933c710-65be-4797-b69a-91289e1f2dd2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.276225] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Instance cache missing network info. 
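
The "Policy check for network:attach_external_network failed" line above is nova.policy probing whether the request credentials may attach to an external network; with only the member and reader roles the check fails, and port creation proceeds without that permission. A minimal sketch, assuming oslo.policy and a stand-in 'role:admin' rule (Nova's real default rule string may differ):

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'role:admin'))

    creds = {'roles': ['member', 'reader'],
             'project_id': '2311daf1600d487ab20c38a98ae2c892'}
    # do_raise=False returns a boolean instead of raising PolicyNotAuthorized.
    print(enforcer.enforce('network:attach_external_network', {}, creds,
                           do_raise=False))   # -> False, as logged above
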
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 568.290859] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befc617b-936d-4df1-bcbf-7c94e80979de {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.326911] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e8e38a8-ed48-4f09-859c-8b2a01f9880e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.335901] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a26bb759-1d41-4ceb-8f81-9c750b3e9f9f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.351589] env[67820]: DEBUG nova.compute.provider_tree [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 568.366096] env[67820]: DEBUG nova.scheduler.client.report [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 568.382024] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 568.382522] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 568.437372] env[67820]: DEBUG nova.compute.utils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 568.439447] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Allocating IP information in the background. 
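
Before PUTting inventory to placement, the report client compares the proposed payload with its cached copy and skips the call when nothing changed, which is what the two "Inventory has not changed" lines above record. A minimal sketch of that short-circuit, using the exact payload from this log (the function name is illustrative):

    CACHED = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }

    def needs_update(cached, proposed):
        # Same resource classes with identical fields -> no PUT required.
        return cached != proposed

    proposed = {rc: dict(fields) for rc, fields in CACHED.items()}
    print(needs_update(CACHED, proposed))  # False -> "Inventory has not changed"
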
{{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 568.439447] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 568.455552] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 568.560858] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 568.593134] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 568.593588] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 568.593588] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 568.593817] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 568.593850] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 568.594034] 
env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 568.594287] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 568.594450] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 568.594617] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 568.594988] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 568.595438] env[67820]: DEBUG nova.virt.hardware [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 568.596511] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51b9bf15-c646-48f9-ad8f-c8fa6bb8bdab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.605673] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f3952f-3aee-4d9e-a6ed-b74cbf3f1eaa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.863902] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Successfully created port: 2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 569.042766] env[67820]: DEBUG nova.policy [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '423f7d15629a474bbea6f6e4c8490f39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b5b33b995674e3aae09667f533954b9', 
'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 569.899874] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Updating instance_info_cache with network_info: [{"id": "75d491ce-6569-4490-80f6-62f2ff27a2db", "address": "fa:16:3e:34:5d:8c", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75d491ce-65", "ovs_interfaceid": "75d491ce-6569-4490-80f6-62f2ff27a2db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 569.914114] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 569.914441] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Instance network_info: |[{"id": "75d491ce-6569-4490-80f6-62f2ff27a2db", "address": "fa:16:3e:34:5d:8c", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75d491ce-65", "ovs_interfaceid": "75d491ce-6569-4490-80f6-62f2ff27a2db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 569.914960] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:5d:8c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '75d491ce-6569-4490-80f6-62f2ff27a2db', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 569.926112] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating folder: Project (6b141581737a44c5894416bcaa7af709). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.927742] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1b5f08f1-ee65-4234-a5aa-26541de1bfd7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.939575] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created folder: Project (6b141581737a44c5894416bcaa7af709) in parent group-v692668. [ 569.939882] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating folder: Instances. Parent ref: group-v692672. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.940215] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-072f3a7b-cf2e-476c-84ed-517ec3b25b51 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.950033] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created folder: Instances in parent group-v692672. [ 569.950305] env[67820]: DEBUG oslo.service.loopingcall [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 569.950866] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 569.951052] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a6f20a27-984d-4aa0-86d2-4ac2d3bc042e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.976134] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Updating instance_info_cache with network_info: [{"id": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "address": "fa:16:3e:5f:a1:73", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5dba3972-d1", "ovs_interfaceid": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 569.983106] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 569.983106] env[67820]: value = "task-3467274" [ 569.983106] env[67820]: _type = "Task" [ 569.983106] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 569.991629] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467274, 'name': CreateVM_Task} progress is 0%. 
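
The "Waiting for the task ... to complete" block and the "progress is 0%" poll above are oslo.vmware's wait_for_task loop: a FixedIntervalLoopingCall polls the vCenter task until it leaves the running states. A minimal sketch of that pattern, assuming only oslo.service (get_state is a hypothetical stand-in for the TaskInfo fetch):

    from oslo_service import loopingcall

    def wait_for_task(get_state, interval=0.5):
        def _poll():
            state = get_state()  # e.g. 'queued', 'running', 'success', 'error'
            if state == 'success':
                raise loopingcall.LoopingCallDone(True)
            if state == 'error':
                raise loopingcall.LoopingCallDone(False)
            # otherwise keep polling; oslo.vmware logs task progress here
        timer = loopingcall.FixedIntervalLoopingCall(_poll)
        return timer.start(interval=interval).wait()

    states = iter(['running', 'running', 'success'])
    print(wait_for_task(lambda: next(states)))  # True, like CreateVM_Task above
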
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 569.999141] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 569.999333] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Instance network_info: |[{"id": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "address": "fa:16:3e:5f:a1:73", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5dba3972-d1", "ovs_interfaceid": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 569.999968] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:a1:73', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5dba3972-d1aa-4f24-a053-15173a6a1b17', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 570.008697] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating folder: Project (890ffca423414cd69eca6a6bf4d1ac66). Parent ref: group-v692668. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.009575] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4c7a1a4a-4681-4fda-9811-dc23c1bbefcc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.022662] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created folder: Project (890ffca423414cd69eca6a6bf4d1ac66) in parent group-v692668. [ 570.022884] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating folder: Instances. Parent ref: group-v692675. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.023186] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4d34b31a-f7dd-408b-af93-41bc96d3da24 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.033822] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created folder: Instances in parent group-v692675. [ 570.034130] env[67820]: DEBUG oslo.service.loopingcall [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 570.034297] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 570.034476] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-116f27cb-1fee-42f3-ba74-9aad6a2e466f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.056456] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 570.056456] env[67820]: value = "task-3467277" [ 570.056456] env[67820]: _type = "Task" [ 570.056456] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 570.066660] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467277, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 570.495192] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467274, 'name': CreateVM_Task, 'duration_secs': 0.366731} completed successfully. 
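
Each build above creates (or reuses) a two-level folder hierarchy in vCenter: a "Project (<tenant id>)" folder under the OpenStack root (group-v692668 here), then an "Instances" folder beneath it that receives the VM. A minimal sketch of the naming and nesting only, with create_folder standing in for the Folder.CreateFolder SOAP call invoked above:

    def ensure_instance_folder(create_folder, root_ref, project_id):
        # First level: per-tenant folder, named exactly as in the log.
        project_ref = create_folder(root_ref, 'Project (%s)' % project_id)
        # Second level: the Instances folder the new VM lands in.
        return create_folder(project_ref, 'Instances')

    refs = ensure_instance_folder(lambda parent, name: (parent, name),
                                  'group-v692668',
                                  '890ffca423414cd69eca6a6bf4d1ac66')
    print(refs)
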
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 570.495401] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 570.539373] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 570.539544] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 570.539896] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 570.540187] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-473b47ec-8d40-4814-ad4c-e2f17e17abbd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.546861] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){ [ 570.546861] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52ced6dc-0894-65c0-3791-3f550bea896d" [ 570.546861] env[67820]: _type = "Task" [ 570.546861] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 570.559955] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52ced6dc-0894-65c0-3791-3f550bea896d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 570.572308] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467277, 'name': CreateVM_Task, 'duration_secs': 0.353237} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 570.572551] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 570.573157] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.061962] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 571.061962] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 571.061962] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.061962] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 571.062622] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 571.062622] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e44d023-0e5d-4947-b519-1268d6e94d7f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 571.072615] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 571.072615] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52164dcb-35c7-ee14-8100-60178d29a996" [ 571.072615] env[67820]: _type = "Task" [ 571.072615] env[67820]: } to complete. 
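
The image-cache locks above follow a fixed naming scheme: "[<datastore>] devstack-image-cache_base/<image id>" guards the cache-directory check, and the same name with "/<image id>.vmdk" appended serializes the fetch step that the builds here go on to request, so concurrent spawns of the same image queue rather than download twice. A minimal sketch of the name construction and the context-manager form of the lock (assuming oslo.concurrency; the cache-dir name is taken verbatim from this log):

    from oslo_concurrency import lockutils

    def cache_lock_name(datastore, image_id, vmdk=False):
        name = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
        if vmdk:
            name += '/%s.vmdk' % image_id
        return name

    image = '4407539e-b292-42b4-91c4-4faa60d48bab'
    with lockutils.lock(cache_lock_name('datastore1', image)):
        pass  # check whether the cached VMDK already exists
    with lockutils.lock(cache_lock_name('datastore1', image, vmdk=True)):
        pass  # fetch/copy the image; later builds of the same image wait here
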
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 571.083839] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Successfully updated port: b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 571.092088] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 571.092346] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 571.092558] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.106351] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Successfully created port: f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 571.110614] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.110748] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquired lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 571.110890] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 571.343095] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 
14478951-d2c1-4472-af0c-354757e0bb0b] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 571.635050] env[67820]: DEBUG nova.compute.manager [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Received event network-vif-plugged-5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 571.635319] env[67820]: DEBUG oslo_concurrency.lockutils [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] Acquiring lock "d08e77ae-af85-4dfe-86e7-60f850369485-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.635611] env[67820]: DEBUG oslo_concurrency.lockutils [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] Lock "d08e77ae-af85-4dfe-86e7-60f850369485-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.635670] env[67820]: DEBUG oslo_concurrency.lockutils [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] Lock "d08e77ae-af85-4dfe-86e7-60f850369485-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.635860] env[67820]: DEBUG nova.compute.manager [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] No waiting events found dispatching network-vif-plugged-5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 571.636117] env[67820]: WARNING nova.compute.manager [req-ff67a05e-b960-4fda-bbfd-eb6282ddb0dd req-a058fc40-6be6-44e8-9706-56518a4d31d6 service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Received unexpected event network-vif-plugged-5dba3972-d1aa-4f24-a053-15173a6a1b17 for instance with vm_state building and task_state spawning. 
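
The event sequence above is Nova's external-event handshake: when Neutron reports network-vif-plugged, the compute manager takes the "<instance uuid>-events" lock, pops any waiter registered for that event, and, finding none because the build is not yet waiting, logs the WARNING about an unexpected event; during spawn this is harmless. A minimal sketch of the pop-or-warn flow (threading.Event stands in for Nova's event objects):

    import threading

    _waiters = {}               # (instance_uuid, event_name) -> threading.Event
    _events_lock = threading.Lock()

    def pop_instance_event(instance_uuid, event_name):
        with _events_lock:      # the "<uuid>-events" lock in the log
            return _waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(instance_uuid, event_name):
        waiter = pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            print('WARNING: received unexpected event %s for instance %s'
                  % (event_name, instance_uuid))
        else:
            waiter.set()        # wake a build blocked on this event

    external_instance_event('d08e77ae-af85-4dfe-86e7-60f850369485',
                            'network-vif-plugged-5dba3972-d1aa-4f24-a053-15173a6a1b17')
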
[ 571.705899] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Successfully created port: e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 572.343559] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Updating instance_info_cache with network_info: [{"id": "b3253af5-845d-47d7-910e-44ddfa9076cf", "address": "fa:16:3e:c0:1d:79", "network": {"id": "8782dc75-c23f-40f8-b98d-da109d55df53", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-874469868-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fddbbfb428ea4faba97457968105ad2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": "nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3253af5-84", "ovs_interfaceid": "b3253af5-845d-47d7-910e-44ddfa9076cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 572.359441] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Releasing lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 572.359787] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Instance network_info: |[{"id": "b3253af5-845d-47d7-910e-44ddfa9076cf", "address": "fa:16:3e:c0:1d:79", "network": {"id": "8782dc75-c23f-40f8-b98d-da109d55df53", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-874469868-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fddbbfb428ea4faba97457968105ad2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": 
"nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3253af5-84", "ovs_interfaceid": "b3253af5-845d-47d7-910e-44ddfa9076cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 572.361057] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:1d:79', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e31a7f15-a808-4199-9071-31fd05e316ea', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b3253af5-845d-47d7-910e-44ddfa9076cf', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 572.373697] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Creating folder: Project (fddbbfb428ea4faba97457968105ad2e). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.374613] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3727212a-9a12-421e-bc2b-eccc1966073a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.388167] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Created folder: Project (fddbbfb428ea4faba97457968105ad2e) in parent group-v692668. [ 572.388382] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Creating folder: Instances. Parent ref: group-v692678. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.388646] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-96076a17-7157-4c97-bc6e-e6defb8547dc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.401018] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Created folder: Instances in parent group-v692678. [ 572.401018] env[67820]: DEBUG oslo.service.loopingcall [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 572.401018] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 572.401018] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c3c8851d-4cf1-4702-8333-7652cfcca978 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.423632] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 572.423632] env[67820]: value = "task-3467280" [ 572.423632] env[67820]: _type = "Task" [ 572.423632] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.432669] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467280, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.829773] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Successfully updated port: 2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 572.851151] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.851297] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquired lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.851443] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 572.952735] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467280, 'name': CreateVM_Task, 'duration_secs': 0.29473} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 572.952910] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.953632] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.953793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.954162] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 572.954444] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-07738483-05c9-4c89-a21f-bd03dbf0ff9c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.962950] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Waiting for the task: (returnval){ [ 572.962950] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52a344f3-c5a1-34e1-a6ba-47e538e2a844" [ 572.962950] env[67820]: _type = "Task" [ 572.962950] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.974137] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52a344f3-c5a1-34e1-a6ba-47e538e2a844, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 573.020504] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance cache missing network info. 
[ 573.482138] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 573.482138] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 573.482430] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 573.503518] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "1cc3b207-a628-4fe5-8908-6879483806b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 573.503754] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 573.515640] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 573.585690] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 573.585920] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 573.587524] env[67820]: INFO nova.compute.claims [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 573.747413] env[67820]: DEBUG nova.compute.manager [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Received event network-vif-plugged-b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 573.747543] env[67820]: DEBUG oslo_concurrency.lockutils [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] Acquiring lock "14478951-d2c1-4472-af0c-354757e0bb0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 573.747741] env[67820]: DEBUG oslo_concurrency.lockutils [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] Lock "14478951-d2c1-4472-af0c-354757e0bb0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 573.748874] env[67820]: DEBUG oslo_concurrency.lockutils [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] Lock "14478951-d2c1-4472-af0c-354757e0bb0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 573.748874] env[67820]: DEBUG nova.compute.manager [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] No waiting events found dispatching network-vif-plugged-b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 573.749143] env[67820]: WARNING nova.compute.manager [req-ced7b183-bee5-4468-b410-c7079a288572 req-91549973-82ca-4fc8-8c64-4bb4210c350f service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Received unexpected event network-vif-plugged-b3253af5-845d-47d7-910e-44ddfa9076cf for instance with vm_state building and task_state spawning.
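The Acquiring/acquired/"released" triplets around build_and_run_instance come from oslo.concurrency's lock tracing: every named lock logs who requested it, how long the caller waited, and how long it was held. Below is a minimal stdlib re-creation of that bookkeeping, assuming a simple in-process lock table; it mirrors the log messages, not the oslo_concurrency.lockutils implementation.

```python
# Sketch of the waited/held lock tracing seen in the log above. The named
# in-process lock table is an assumption for illustration; oslo.concurrency
# also supports external (file-based) locks.
import threading
import time
from collections import defaultdict
from contextlib import contextmanager

_locks = defaultdict(threading.Lock)  # one lock per name, like lockutils


@contextmanager
def traced_lock(name, owner):
    print(f'Acquiring lock "{name}" by "{owner}"')
    t0 = time.monotonic()
    with _locks[name]:
        t1 = time.monotonic()
        print(f'Lock "{name}" acquired by "{owner}" :: waited {t1 - t0:.3f}s')
        try:
            yield
        finally:
            held = time.monotonic() - t1
            print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')


with traced_lock("compute_resources", "ResourceTracker.instance_claim"):
    time.sleep(0.01)  # the resource claim happens while the lock is held
```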
[ 573.813107] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da679aa1-076b-4968-82bb-e7b431a00597 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 573.824611] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6365d2b4-00ff-4817-b6f5-11cdc96b4291 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 573.859546] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ea9109-ffc7-41e6-bf23-544943f66249 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 573.867791] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-647bb327-8579-4d7c-9b80-9175c5c00559 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 573.882993] env[67820]: DEBUG nova.compute.provider_tree [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 573.902643] env[67820]: DEBUG nova.scheduler.client.report [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 573.929025] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.343s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 573.929605] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 574.090249] env[67820]: DEBUG nova.compute.utils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 574.095781] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 574.098074] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 574.200976] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 574.228286] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Updating instance_info_cache with network_info: [{"id": "2455c43c-ba7c-4799-be60-baec0227f246", "address": "fa:16:3e:ba:00:77", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2455c43c-ba", "ovs_interfaceid": "2455c43c-ba7c-4799-be60-baec0227f246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 574.261746] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Releasing lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 574.262072] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance network_info: |[{"id": "2455c43c-ba7c-4799-be60-baec0227f246", "address": "fa:16:3e:ba:00:77", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2455c43c-ba", "ovs_interfaceid": "2455c43c-ba7c-4799-be60-baec0227f246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 574.262475] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:00:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2455c43c-ba7c-4799-be60-baec0227f246', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 574.274518] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Creating folder: Project (f77ce380931343b591d73e4966a830d5). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 574.275610] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2e25a86-fbf6-43e1-acb2-9f99d32c96a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.295176] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Created folder: Project (f77ce380931343b591d73e4966a830d5) in parent group-v692668.
[ 574.295393] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Creating folder: Instances. Parent ref: group-v692681. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
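The vmops lines above translate each entry of the neutron-supplied network_info into the "Instance VIF info" dict handed to the VMware driver: the bridge name, the MAC, an OpaqueNetwork reference keyed by the NSX logical-switch id, the neutron port id as iface_id, and the vmxnet3 model. A sketch of that mapping follows, with field names taken from the logged structures; the function itself is illustrative, not the nova.virt.vmwareapi code.

```python
# Illustrative network_info -> VIF info translation, matching the shapes of
# the structures logged above. This is a sketch, not nova's implementation.
def vif_info_from_network_info(network_info, vif_model="vmxnet3"):
    vifs = []
    for vif in network_info:
        details = vif["details"]
        vifs.append({
            "network_name": vif["network"]["bridge"],   # e.g. br-int
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],                       # neutron port id
            "vif_model": vif_model,
        })
    return vifs


# Minimal input carrying only the fields the mapping reads.
network_info = [{
    "id": "2455c43c-ba7c-4799-be60-baec0227f246",
    "address": "fa:16:3e:ba:00:77",
    "network": {"bridge": "br-int"},
    "details": {"nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1"},
}]
print(vif_info_from_network_info(network_info))
```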
[ 574.295668] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4462502d-e028-4f6e-b114-fe1eb414fb3b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.309162] env[67820]: DEBUG nova.policy [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bad010179beb40ba86e444ac1b542097', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '45335f4ca9ab47218fc21f73d66c7c4f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
[ 574.313246] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Created folder: Instances in parent group-v692681.
[ 574.313482] env[67820]: DEBUG oslo.service.loopingcall [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 574.313684] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 574.313914] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-106f53fb-9aa7-408b-abc0-5b0f61a78fe7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.339615] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 574.339615] env[67820]: value = "task-3467283"
[ 574.339615] env[67820]: _type = "Task"
[ 574.339615] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 574.348691] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467283, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 574.356037] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 574.389175] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 574.389437] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 574.389591] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 574.389763] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 574.389954] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 574.390613] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 574.390810] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 574.390969] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 574.391153] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 574.392030] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 574.392030] env[67820]: DEBUG nova.virt.hardware [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 574.392413] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca40feaf-cbf1-4ddf-81ab-2ad12b587a1c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.403018] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a87457-4a3a-43de-a477-e160f0157d35 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.471104] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Successfully updated port: f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 574.487343] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 574.487609] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquired lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 574.487647] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 574.703768] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
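The nova.virt.hardware lines walk a topology search: with no flavor or image constraints the limits default to 65536 sockets, cores, and threads, and for the 1-vCPU m1.nano flavor the only factorisation is 1:1:1. A worked sketch of that enumeration follows; the brute-force search mirrors the idea, not nova's exact code.

```python
# Enumerate (sockets, cores, threads) triples whose product equals the vCPU
# count, under the per-dimension caps. Illustrative sketch of the search the
# log describes, not nova.virt.hardware itself.
from itertools import product


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    for sockets, cores, threads in product(
            range(1, min(vcpus, max_sockets) + 1),
            range(1, min(vcpus, max_cores) + 1),
            range(1, min(vcpus, max_threads) + 1)):
        if sockets * cores * threads == vcpus:
            yield (sockets, cores, threads)


print(list(possible_topologies(1)))  # [(1, 1, 1)] -- the single topology logged
print(list(possible_topologies(4)))  # all factorisations of a 4-vCPU flavor
```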
[ 574.851948] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467283, 'name': CreateVM_Task, 'duration_secs': 0.305772} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 574.852939] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 574.853884] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 574.853960] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 574.854877] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 574.854877] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bccc58e1-021b-40ec-b54a-3d565dd3ba4c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.859809] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for the task: (returnval){
[ 574.859809] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5277327b-6c2f-7d76-c410-592cd4a3bebb"
[ 574.859809] env[67820]: _type = "Task"
[ 574.859809] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 574.868061] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5277327b-6c2f-7d76-c410-592cd4a3bebb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 575.095449] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Successfully updated port: e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 575.111652] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 575.111832] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquired lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 575.111964] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 575.216237] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 575.371539] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 575.371967] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 575.372298] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 575.521240] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Updating instance_info_cache with network_info: [{"id": "f518be0c-8624-4aef-8248-c09544c76756", "address": "fa:16:3e:b3:6a:fc", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf518be0c-86", "ovs_interfaceid": "f518be0c-8624-4aef-8248-c09544c76756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 575.523022] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Successfully created port: 5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 575.539921] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Releasing lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 575.540257] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance network_info: |[{"id": "f518be0c-8624-4aef-8248-c09544c76756", "address": "fa:16:3e:b3:6a:fc", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf518be0c-86", "ovs_interfaceid": "f518be0c-8624-4aef-8248-c09544c76756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 575.541296] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b3:6a:fc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f518be0c-8624-4aef-8248-c09544c76756', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 575.548916] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Creating folder: Project (2311daf1600d487ab20c38a98ae2c892). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 575.549510] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46badcce-ac99-4b13-a07d-eb0f8433d7ce {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.562122] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Created folder: Project (2311daf1600d487ab20c38a98ae2c892) in parent group-v692668.
[ 575.562122] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Creating folder: Instances. Parent ref: group-v692684. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 575.562368] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7f9470f0-6276-4fd3-9183-c0be9ed607e0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.574025] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Created folder: Instances in parent group-v692684.
[ 575.574025] env[67820]: DEBUG oslo.service.loopingcall [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 575.574025] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 575.574025] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aaa5631c-19f7-44d8-9ac8-404710b3dd23 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.603727] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 575.603727] env[67820]: value = "task-3467286"
[ 575.603727] env[67820]: _type = "Task"
[ 575.603727] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 575.611576] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467286, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 575.629537] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.629844] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.630108] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 575.630257] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 575.661417] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.661577] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.661711] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.662074] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.662074] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.663438] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.663712] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.663856] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 575.663987] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 575.664929] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.664929] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.665093] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.665436] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.665436] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.665612] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.665966] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
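The oslo_service.periodic_task lines are the ComputeManager's task runner ticking through its registered periodic tasks, including the reclaim task that exits early because reclaim_instance_interval is non-positive. Below is a sketch of how such tasks are declared with oslo.service; the spacing values and the MiniManager class are illustrative stand-ins, not nova's configuration.

```python
# Sketch of declaring periodic tasks with oslo.service, as logged above.
# MiniManager and its attributes are illustrative; nova's ComputeManager
# reads the real intervals from CONF.
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF


class MiniManager(periodic_task.PeriodicTasks):
    """Tiny stand-in for nova.compute.manager.ComputeManager."""

    reclaim_instance_interval = 0  # stands in for CONF.reclaim_instance_interval

    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _heal_instance_info_cache(self, context):
        print("Starting heal instance info cache")

    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _reclaim_queued_deletes(self, context):
        # Mirrors the guard in the log: a non-positive interval disables it.
        if self.reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return


mgr = MiniManager(CONF)
# The service framework normally drives this on a timer; tasks whose spacing
# has elapsed (or that are marked run_immediately) execute on each tick.
mgr.run_periodic_tasks(context=None)
```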
[ 575.665966] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 575.682034] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 575.682034] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 575.682034] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 575.682034] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 575.684019] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2902ade0-54a8-443b-b917-d988eacf12a4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.693843] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50320b7f-1731-4673-9793-5c04b436823b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.717317] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d984cbd-8469-4b40-9f57-13cbbce3f3d2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.724779] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f61bd07-e824-475f-b2f2-2d415cc074aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.756625] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180949MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 575.756764] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 575.757141] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 575.877931] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d08e77ae-af85-4dfe-86e7-60f850369485 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.878841] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ca9f607c-cde9-459e-aa9e-4b060bc8a68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.879016] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11335aca-6576-4d27-b50b-95ecc0c157a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.879153] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.879310] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 14478951-d2c1-4472-af0c-354757e0bb0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.879504] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.879665] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.880092] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 575.880340] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 575.880497] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 575.976678] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Updating instance_info_cache with network_info: [{"id": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "address": "fa:16:3e:4a:b2:c1", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3a51c96-48", "ovs_interfaceid": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 575.996360] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Releasing lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 575.998267] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance network_info: |[{"id": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "address": "fa:16:3e:4a:b2:c1", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3a51c96-48", "ovs_interfaceid": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
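The resource tracker figures above are internally consistent: eight 1-vCPU, 128 MB, 1 GB instances plus the 512 MB host memory reservation reproduce exactly the logged used_ram=1536MB, used_disk=8GB, and used_vcpus=8. A quick check, assuming those flavor numbers:

```python
# Worked check of the "Final resource view" arithmetic from the log above.
# The flavor figures come from the m1.nano allocations shown in the log.
instances = 8
flavor = {"memory_mb": 128, "root_gb": 1, "vcpus": 1}
reserved_host_memory_mb = 512  # the MEMORY_MB 'reserved' value in the inventory

used_ram_mb = reserved_host_memory_mb + instances * flavor["memory_mb"]
used_disk_gb = instances * flavor["root_gb"]
used_vcpus = instances * flavor["vcpus"]

assert (used_ram_mb, used_disk_gb, used_vcpus) == (1536, 8, 8)
print(f"used_ram={used_ram_mb}MB used_disk={used_disk_gb}GB "
      f"used_vcpus={used_vcpus}")  # matches the logged final resource view
```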
"mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3a51c96-48", "ovs_interfaceid": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 575.998513] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4a:b2:c1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e3a51c96-4813-4e0d-87c6-1c953da1e331', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 576.009789] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Creating folder: Project (8b5b33b995674e3aae09667f533954b9). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 576.013105] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-73ee1c8b-49fc-4e73-958b-8ee383cdcb4d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.024674] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Created folder: Project (8b5b33b995674e3aae09667f533954b9) in parent group-v692668. [ 576.024674] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Creating folder: Instances. Parent ref: group-v692687. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 576.026019] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-219d9fb7-0954-47ec-ae26-79f4047144cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 576.036834] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Created folder: Instances in parent group-v692687. [ 576.037156] env[67820]: DEBUG oslo.service.loopingcall [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 576.037796] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 576.037796] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5d6e771-76af-47bf-a26e-9a791ff975c2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.058498] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab1d1814-3f42-41e5-8f3a-4bc79a9d4367 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.064678] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 576.064678] env[67820]: value = "task-3467289"
[ 576.064678] env[67820]: _type = "Task"
[ 576.064678] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 576.075052] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467289, 'name': CreateVM_Task} progress is 6%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 576.078063] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a15bd1db-93c9-436c-8937-cb3cf3f01ae7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.122548] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce33f062-bd9f-489d-aa24-787cf18c0daa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.131694] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467286, 'name': CreateVM_Task} progress is 99%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
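The CreateVM_Task records above show the client-side completion model used throughout this trace: Folder.CreateVM_Task returns a task handle immediately, and the caller polls it ("progress is 6%." ... "progress is 99%." ... "completed successfully.") until it reaches a terminal state. A minimal sketch of that polling loop; get_task_info is a hypothetical accessor standing in for the SOAP property reads oslo.vmware actually performs, and the fixed sleep is an assumption rather than the library's timer plumbing:

    import time

    POLL_INTERVAL = 0.5  # seconds between polls; assumed, not oslo.vmware's value

    def wait_for_task(get_task_info):
        """Poll a vCenter-style task until it reaches a terminal state."""
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 6}
            if info['state'] == 'success':
                return info          # "completed successfully" in the log
            if info['state'] == 'error':
                raise RuntimeError('task failed: %s' % info.get('error'))
            # Matches the "progress is N%" records interleaved above.
            print('progress is %s%%' % info.get('progress', 0))
            time.sleep(POLL_INTERVAL)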
[ 576.135612] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8dd473a-769e-4882-863c-069d80a077fa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.152178] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 576.164060] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 576.196455] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 576.196455] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.439s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 576.578560] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467289, 'name': CreateVM_Task, 'duration_secs': 0.417364} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
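The inventory record above pins down what the scheduler can place on this node. Placement derives capacity per resource class as (total - reserved) * allocation_ratio, so 48 physical vCPUs at a 4.0 ratio expose 192 schedulable VCPU, while max_unit caps any single allocation (no one instance can claim more than 16 VCPUs here). A quick check of that arithmetic, using the numbers exactly as logged:

    # Inventory exactly as reported in the record above (min_unit/step_size omitted).
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'max_unit': 16, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'max_unit': 94, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable={capacity:g}, single-allocation cap={inv['max_unit']}")

    # VCPU: schedulable=192, single-allocation cap=16
    # MEMORY_MB: schedulable=196078, single-allocation cap=65530
    # DISK_GB: schedulable=400, single-allocation cap=94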
[ 576.581122] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 576.581122] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 576.581122] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 576.581122] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 576.581122] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb22535c-b84b-42f7-9033-2defe9c2d94b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 576.585115] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for the task: (returnval){
[ 576.585115] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]529298c6-9d81-1180-fc9e-9f90c37ec360"
[ 576.585115] env[67820]: _type = "Task"
[ 576.585115] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 576.599685] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]529298c6-9d81-1180-fc9e-9f90c37ec360, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 576.627018] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467286, 'name': CreateVM_Task, 'duration_secs': 0.530855} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
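The Acquiring/Acquired records above around "[datastore1] devstack-image-cache_base/..." show concurrent spawns serializing on a per-image named lock, so only one request at a time inspects or populates the shared image cache; the matching Releasing records appear further down the trace. A simplified sketch of that pattern with plain threading primitives, assuming an in-process dict registry where oslo.concurrency actually provides named, optionally inter-process ("external semaphore") locks:

    import threading
    from collections import defaultdict

    # One lock per name; stands in for oslo.concurrency's lock registry.
    _locks = defaultdict(threading.Lock)

    def with_named_lock(name, fn):
        """Run fn() while holding the lock shared by everyone using `name`."""
        print(f'Acquiring lock "{name}"')
        with _locks[name]:                 # blocks while another spawn holds it
            print(f'Acquired lock "{name}"')
            result = fn()
        print(f'Releasing lock "{name}"')  # mirrors the Releasing records below
        return result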
[ 576.627018] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 576.627587] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 577.081274] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Successfully updated port: 5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 577.099717] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 577.100738] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 577.102038] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 577.102038] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 577.102556] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 577.103089] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e5c1d12-122e-4a6e-a881-6046d5e7399d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.109210] env[67820]: DEBUG oslo_concurrency.lockutils [None
req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 577.109210] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquired lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 577.109210] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 577.118457] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for the task: (returnval){ [ 577.118457] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]523d6181-1b46-3e04-910c-d41e12da0033" [ 577.118457] env[67820]: _type = "Task" [ 577.118457] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 577.131539] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]523d6181-1b46-3e04-910c-d41e12da0033, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 577.181996] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Received event network-vif-plugged-75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 577.181996] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 577.181996] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 577.181996] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 577.182490] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] No waiting events found dispatching network-vif-plugged-75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 577.182490] env[67820]: WARNING nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Received unexpected event network-vif-plugged-75d491ce-6569-4490-80f6-62f2ff27a2db for instance with vm_state building and task_state spawning. [ 577.182490] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Received event network-changed-5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 577.182490] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Refreshing instance network info cache due to event network-changed-5dba3972-d1aa-4f24-a053-15173a6a1b17. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 577.182490] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 577.182637] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquired lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 577.182637] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Refreshing network info cache for port 5dba3972-d1aa-4f24-a053-15173a6a1b17 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 577.199141] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "80f480dc-9bb8-4764-9b6b-793c0954962e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 577.199141] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 577.754922] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
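The network-vif-plugged handling in the records just above (pop_instance_event under a per-instance "-events" lock, then "No waiting events found dispatching ..." and an unexpected-event WARNING) is the compute manager matching Neutron's callbacks against waiters registered by the spawning thread; an event that arrives with no registered waiter is logged and dropped. A toy version of that registry, using threading.Event where Nova actually uses its own eventlet-based event objects:

    import threading

    _events_lock = threading.Lock()   # plays the role of the "-events" lock
    _waiters = {}                     # (instance_uuid, event_name) -> Event

    def prepare_for_event(instance_uuid, event_name):
        """Spawning thread registers interest before blocking on the event."""
        ev = threading.Event()
        with _events_lock:
            _waiters[(instance_uuid, event_name)] = ev
        return ev

    def dispatch_event(instance_uuid, event_name):
        """Called when an external (Neutron) event arrives for an instance."""
        with _events_lock:
            ev = _waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # Corresponds to the "Received unexpected event ..." WARNING above.
            print(f'WARNING: unexpected event {event_name} for {instance_uuid}')
        else:
            ev.set()                  # wakes the thread blocked in spawn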
[ 577.770491] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 577.770491] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 577.770491] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 577.771381] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 577.848357] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 577.848569] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 577.850865] env[67820]: INFO nova.compute.claims [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 578.123661] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-857064e1-c3e3-4436-861c-13bf7d4025e7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.132694] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bfb36cf-2f05-41da-aa59-c298e591fd35 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.174735] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-2cf2cc9b-0cc7-4a8c-af2b-4761f40fed8e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.182717] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6832bf9d-e5d9-45b4-a1fe-f1d9e189e26e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.199428] env[67820]: DEBUG nova.compute.provider_tree [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 578.210790] env[67820]: DEBUG nova.scheduler.client.report [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 578.237170] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.389s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 578.237710] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 578.264553] env[67820]: DEBUG nova.compute.manager [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Received event network-changed-b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 578.264819] env[67820]: DEBUG nova.compute.manager [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Refreshing instance network info cache due to event network-changed-b3253af5-845d-47d7-910e-44ddfa9076cf. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 578.264974] env[67820]: DEBUG oslo_concurrency.lockutils [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] Acquiring lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 578.266052] env[67820]: DEBUG oslo_concurrency.lockutils [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] Acquired lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 578.266738] env[67820]: DEBUG nova.network.neutron [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Refreshing network info cache for port b3253af5-845d-47d7-910e-44ddfa9076cf {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 578.300437] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Updating instance_info_cache with network_info: [{"id": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "address": "fa:16:3e:ef:3e:22", "network": {"id": "6463dc42-b48e-476b-85b0-786f7793e7a9", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-381911185-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45335f4ca9ab47218fc21f73d66c7c4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dabbac20-1723-40ad-9da0-e53b28073651", "external-id": "nsx-vlan-transportzone-790", "segmentation_id": 790, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e92dc3b-a0", "ovs_interfaceid": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 578.305883] env[67820]: DEBUG nova.compute.utils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 578.308423] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Allocating IP information in the background. 
{{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 578.308712] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 578.322607] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Releasing lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 578.322958] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance network_info: |[{"id": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "address": "fa:16:3e:ef:3e:22", "network": {"id": "6463dc42-b48e-476b-85b0-786f7793e7a9", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-381911185-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45335f4ca9ab47218fc21f73d66c7c4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dabbac20-1723-40ad-9da0-e53b28073651", "external-id": "nsx-vlan-transportzone-790", "segmentation_id": 790, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e92dc3b-a0", "ovs_interfaceid": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 578.323418] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Start building block device mappings for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 578.329862] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ef:3e:22', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dabbac20-1723-40ad-9da0-e53b28073651', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5e92dc3b-a08e-4e33-b663-d2ad36167103', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 578.338276] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Creating folder: Project (45335f4ca9ab47218fc21f73d66c7c4f). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 578.339128] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fc90e9e-d3b9-4db7-aee5-faa5b35cb8fb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.351239] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Created folder: Project (45335f4ca9ab47218fc21f73d66c7c4f) in parent group-v692668. [ 578.351433] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Creating folder: Instances. Parent ref: group-v692690. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 578.351674] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e956282-f298-4f36-8802-9e657c8a61a4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.364586] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Created folder: Instances in parent group-v692690. [ 578.364830] env[67820]: DEBUG oslo.service.loopingcall [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 578.365025] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 578.365231] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3bde8646-3c37-4b79-8c66-750db553e2b2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 578.394840] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 578.394840] env[67820]: value = "task-3467292" [ 578.394840] env[67820]: _type = "Task" [ 578.394840] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 578.407940] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467292, 'name': CreateVM_Task} progress is 6%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 578.437023] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 578.471285] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 578.471550] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 578.471715] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 578.471893] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 578.472053] env[67820]: DEBUG 
nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 578.472207] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 578.472411] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 578.473628] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 578.473628] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 578.473628] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 578.473628] env[67820]: DEBUG nova.virt.hardware [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 578.477773] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ed1c00e-4ab5-40e7-9a18-aba8fdab5138 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.482103] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-570a1f91-308a-4a5c-90ad-99826bc2cf53 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 578.649415] env[67820]: DEBUG nova.policy [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1cfe1c5c465740f9a298f578191b610f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f5e2fd2445af40d98840b854a6a50ff9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
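The nova.virt.hardware records above trace topology selection for the m1.nano flavor: with no flavor or image constraints the preference collapses to 0:0:0, the limits default to 65536 each, and a 1-vCPU instance admits exactly one factorization, sockets=1, cores=1, threads=1. A rough sketch of that enumeration step (factor the vCPU count into sockets * cores * threads under the limits), leaving out Nova's preference sorting:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) triples whose product is vcpus."""
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(per_socket, max_cores) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    yield (sockets, cores, threads)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- "Got 1 possible topologies"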
[ 578.908542] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467292, 'name': CreateVM_Task} progress is 99%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 579.412603] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467292, 'name': CreateVM_Task, 'duration_secs': 0.545556} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 579.412603] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 579.412603] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 579.412603] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 579.412928] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 579.413753] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da3c9ba5-65e0-4021-866a-25d74369b91b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 579.419783] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Updated VIF entry in instance network info cache for port 5dba3972-d1aa-4f24-a053-15173a6a1b17.
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 579.420308] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Updating instance_info_cache with network_info: [{"id": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "address": "fa:16:3e:5f:a1:73", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5dba3972-d1", "ovs_interfaceid": "5dba3972-d1aa-4f24-a053-15173a6a1b17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 579.428659] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for the task: (returnval){ [ 579.428659] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f31a79-36b9-a440-bd45-fcb87fb1df91" [ 579.428659] env[67820]: _type = "Task" [ 579.428659] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 579.444762] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f31a79-36b9-a440-bd45-fcb87fb1df91, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 579.448126] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Releasing lock "refresh_cache-d08e77ae-af85-4dfe-86e7-60f850369485" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 579.448977] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Received event network-changed-75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 579.449351] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Refreshing instance network info cache due to event network-changed-75d491ce-6569-4490-80f6-62f2ff27a2db. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 579.449666] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 579.450130] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquired lock "refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 579.450292] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Refreshing network info cache for port 75d491ce-6569-4490-80f6-62f2ff27a2db {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 579.695193] env[67820]: DEBUG nova.network.neutron [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Updated VIF entry in instance network info cache for port b3253af5-845d-47d7-910e-44ddfa9076cf. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 579.695193] env[67820]: DEBUG nova.network.neutron [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Updating instance_info_cache with network_info: [{"id": "b3253af5-845d-47d7-910e-44ddfa9076cf", "address": "fa:16:3e:c0:1d:79", "network": {"id": "8782dc75-c23f-40f8-b98d-da109d55df53", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-874469868-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fddbbfb428ea4faba97457968105ad2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": "nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb3253af5-84", "ovs_interfaceid": "b3253af5-845d-47d7-910e-44ddfa9076cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 579.712623] env[67820]: DEBUG oslo_concurrency.lockutils [req-583dbea7-fe90-4b52-a59d-35a90e532767 req-ccc61d52-8879-4a97-be74-890cd50f2ebc service nova] Releasing lock "refresh_cache-14478951-d2c1-4472-af0c-354757e0bb0b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 579.945389] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 579.946954] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 579.947435] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 580.202386] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Successfully created port: 
b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 580.218739] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.218739] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.246966] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 580.273502] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.273738] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.334193] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.334659] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.337223] env[67820]: INFO nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 580.341737] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.342203] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.464475] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Updated VIF entry in instance network info cache for port 75d491ce-6569-4490-80f6-62f2ff27a2db. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 580.464475] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Updating instance_info_cache with network_info: [{"id": "75d491ce-6569-4490-80f6-62f2ff27a2db", "address": "fa:16:3e:34:5d:8c", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap75d491ce-65", "ovs_interfaceid": "75d491ce-6569-4490-80f6-62f2ff27a2db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 580.484098] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Releasing lock "refresh_cache-ca9f607c-cde9-459e-aa9e-4b060bc8a68b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 580.485021] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Received event network-vif-plugged-2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 580.487293] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 580.487293] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 580.487293] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 580.487293] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] No waiting events found dispatching network-vif-plugged-2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 580.487427] env[67820]: WARNING nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Received unexpected event network-vif-plugged-2455c43c-ba7c-4799-be60-baec0227f246 for instance with vm_state building and task_state spawning.
[ 580.487427] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Received event network-changed-2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 580.487427] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Refreshing instance network info cache due to event network-changed-2455c43c-ba7c-4799-be60-baec0227f246.
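Every Acquiring/acquired/released triple in this stretch is oslo.concurrency's lockutils at work: the build serializes on the instance UUID, the resource tracker on "compute_resources", and event dispatch on "<uuid>-events". A minimal sketch of the two forms nova uses follows; the lock names and critical sections here are illustrative, not nova's code.

    # Sketch of the lockutils pattern seen in the log above.
    # Lock names and critical sections are illustrative only.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def claim_resources():
        # Runs under the named lock, so concurrent claims against
        # the same resource tracker are serialized.
        pass

    def build_instance(instance_uuid):
        # Context-manager form used for per-instance build locks; the
        # "waited"/"held" timings in the log bracket this with-block.
        with lockutils.lock(instance_uuid):
            claim_resources()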
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 580.488243] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 580.490078] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquired lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 580.490078] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Refreshing network info cache for port 2455c43c-ba7c-4799-be60-baec0227f246 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 580.723666] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ffa21a-bd78-48ed-a897-85d5df07e779 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.732936] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d15dd22f-e264-4b2e-91a3-a32bff7356cc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.769150] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2c6dbf8-a745-464a-808d-caad05f70497 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.777149] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eadabfc3-6a7b-4591-9b48-1f34e3bf801e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 580.792888] env[67820]: DEBUG nova.compute.provider_tree [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 580.809939] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 580.838878] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.504s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 580.839347] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 580.894377] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 580.895645] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 580.899135] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 580.913067] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 581.022856] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 581.056376] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 581.057112] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 581.057112] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 581.057658] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 581.058014] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 581.058014] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 581.058204] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 581.058388] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 581.058557] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 581.058725] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 581.058909] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 581.059770] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3eac88c-9af8-41fa-bd63-bca51edf1719 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.070488] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe4518a4-3c11-4b67-bc8b-e5637326e3cb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 581.075635] env[67820]: DEBUG nova.policy [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2aa367ffec7d4b3caa69171ba56159b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c89a428d22540a29a3801ff4639145c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 581.749511] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Updated VIF entry in instance network info cache for port 2455c43c-ba7c-4799-be60-baec0227f246. 
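The nova.virt.hardware lines above trace the CPU topology search for the m1.nano flavor: with no flavor or image constraints the per-dimension limits default to 65536, and a 1-vCPU guest admits exactly one topology, 1:1:1 ("Got 1 possible topologies"). A simplified re-implementation of that enumeration, illustrative only since nova's actual search differs in details:

    # Simplified sketch of the topology enumeration traced above:
    # find all (sockets, cores, threads) whose product equals the vCPU
    # count, subject to per-dimension maxima (65536 when unconstrained).
    from collections import namedtuple

    VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield VirtCPUTopology(s, c, t)

    print(list(possible_topologies(1)))
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)]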
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 581.749884] env[67820]: DEBUG nova.network.neutron [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Updating instance_info_cache with network_info: [{"id": "2455c43c-ba7c-4799-be60-baec0227f246", "address": "fa:16:3e:ba:00:77", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2455c43c-ba", "ovs_interfaceid": "2455c43c-ba7c-4799-be60-baec0227f246", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 581.765250] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Releasing lock "refresh_cache-0421f92d-8c91-4a60-beb9-f1a799e6d1b4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 581.765441] env[67820]: DEBUG nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Received event network-vif-plugged-f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 581.765637] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Acquiring lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 581.765926] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 581.766190] env[67820]: DEBUG oslo_concurrency.lockutils [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] Lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 581.766190] env[67820]: DEBUG nova.compute.manager
[req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] No waiting events found dispatching network-vif-plugged-f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 581.766310] env[67820]: WARNING nova.compute.manager [req-508884b1-abdb-4d0c-bba0-625f148b65e2 req-c3bc0e40-8ab2-4d06-a0d4-ca952854926c service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Received unexpected event network-vif-plugged-f518be0c-8624-4aef-8248-c09544c76756 for instance with vm_state building and task_state spawning. [ 582.036750] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Successfully updated port: b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 582.055269] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.056090] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquired lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.056090] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 582.287552] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance cache missing network info. 
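The "No waiting events found" / "Received unexpected event" pair above is a benign race: neutron delivers network-vif-plugged before the build has registered a waiter for it, so the event is dropped with a warning. A toy model of that dispatch logic follows; names and structure are illustrative, not nova's InstanceEvents implementation.

    # Toy model of the waiter-dispatch behaviour in the log: deliver an
    # event to a registered waiter if one exists, otherwise report it
    # as unexpected. Purely illustrative.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}      # (instance_uuid, event_name) -> Event
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, event_name):
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:        # the "<uuid>-events" lock in the log
                return self._waiters.pop((instance_uuid, event_name), None)

    def dispatch(events, instance_uuid, event_name):
        waiter = events.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            print('Received unexpected event %s for %s' % (event_name, instance_uuid))
        else:
            waiter.set()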
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 582.296179] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Successfully created port: 03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 582.518645] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Received event network-vif-plugged-e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 582.519066] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Acquiring lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 582.519643] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 582.520491] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 582.520610] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] No waiting events found dispatching network-vif-plugged-e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 582.520884] env[67820]: WARNING nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Received unexpected event network-vif-plugged-e3a51c96-4813-4e0d-87c6-1c953da1e331 for instance with vm_state building and task_state spawning.
[ 582.521297] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Received event network-changed-f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 582.523044] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Refreshing instance network info cache due to event network-changed-f518be0c-8624-4aef-8248-c09544c76756. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 582.523044] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Acquiring lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 582.523044] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Acquired lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 582.523044] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Refreshing network info cache for port f518be0c-8624-4aef-8248-c09544c76756 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 582.705725] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Updating instance_info_cache with network_info: [{"id": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "address": "fa:16:3e:6c:d5:fc", "network": {"id": "1f370e52-9a90-4dbb-a769-629fedc29049", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1745213521-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f5e2fd2445af40d98840b854a6a50ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a1895250-76cc-41f7-b7f8-2e5679494607", "external-id": "nsx-vlan-transportzone-785", "segmentation_id": 785, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb19cc479-df", "ovs_interfaceid": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 582.708051] env[67820]: DEBUG nova.compute.manager [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Received event network-vif-plugged-5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 582.710734] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Acquiring lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 582.710734] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 582.710734] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 582.710734] env[67820]: DEBUG nova.compute.manager [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] No waiting events found dispatching network-vif-plugged-5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 582.711034] env[67820]: WARNING nova.compute.manager [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Received unexpected event network-vif-plugged-5e92dc3b-a08e-4e33-b663-d2ad36167103 for instance with vm_state building and task_state spawning.
[ 582.711034] env[67820]: DEBUG nova.compute.manager [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Received event network-changed-5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 582.711034] env[67820]: DEBUG nova.compute.manager [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Refreshing instance network info cache due to event network-changed-5e92dc3b-a08e-4e33-b663-d2ad36167103.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 582.711034] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Acquiring lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 582.711034] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Acquired lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 582.711230] env[67820]: DEBUG nova.network.neutron [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Refreshing network info cache for port 5e92dc3b-a08e-4e33-b663-d2ad36167103 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 582.722212] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Releasing lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 582.722962] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance network_info: |[{"id": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "address": "fa:16:3e:6c:d5:fc", "network": {"id": "1f370e52-9a90-4dbb-a769-629fedc29049", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1745213521-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f5e2fd2445af40d98840b854a6a50ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a1895250-76cc-41f7-b7f8-2e5679494607", "external-id": "nsx-vlan-transportzone-785", "segmentation_id": 785, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb19cc479-df", "ovs_interfaceid": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 582.726210] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:d5:fc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a1895250-76cc-41f7-b7f8-2e5679494607', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 
'b19cc479-df25-4b66-a6ad-93d584c8d1c1', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 582.734250] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Creating folder: Project (f5e2fd2445af40d98840b854a6a50ff9). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.738075] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ac153d7-e564-4dd5-bb02-0e1296a0d49a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.750533] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Created folder: Project (f5e2fd2445af40d98840b854a6a50ff9) in parent group-v692668. [ 582.753131] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Creating folder: Instances. Parent ref: group-v692693. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.753131] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9098341b-bcb8-4efa-91a4-0123a546b823 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.763422] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Created folder: Instances in parent group-v692693. [ 582.766066] env[67820]: DEBUG oslo.service.loopingcall [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 582.766066] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 582.766066] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-07c0249c-9f0c-44c2-b278-f2d687fbee68 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 582.787255] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 582.787255] env[67820]: value = "task-3467295" [ 582.787255] env[67820]: _type = "Task" [ 582.787255] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 582.795479] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467295, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 583.297234] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467295, 'name': CreateVM_Task, 'duration_secs': 0.317129} completed successfully. 
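CreateVM_Task above shows the standard vSphere task pattern: the call returns a task handle immediately and oslo.vmware's wait_for_task polls it (the "progress is 0%" lines) until it completes with a duration_secs. A generic sketch of such a poll loop; the get_task_info callable and its result fields are hypothetical stand-ins for the real task API:

    # Generic poll loop of the kind wait_for_task implements. The
    # `get_task_info` callable and its result fields are hypothetical.
    import time

    def wait_for_task(get_task_info, interval=0.5, timeout=300):
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info['state'] == 'success':   # e.g. CreateVM_Task done
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            # 'running'/'queued': yields progress lines like "progress is 0%"
            time.sleep(interval)
        raise TimeoutError('task did not complete in %ss' % timeout)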
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 583.297602] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 583.298110] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.298276] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 583.298584] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 583.298837] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-528ce176-c59b-47cc-b17b-2353a0d478aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 583.304200] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for the task: (returnval){ [ 583.304200] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]525e38eb-cd25-7cc0-dc90-c3f4805488ba" [ 583.304200] env[67820]: _type = "Task" [ 583.304200] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 583.313283] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]525e38eb-cd25-7cc0-dc90-c3f4805488ba, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 583.452220] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Updated VIF entry in instance network info cache for port f518be0c-8624-4aef-8248-c09544c76756. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 583.452530] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Updating instance_info_cache with network_info: [{"id": "f518be0c-8624-4aef-8248-c09544c76756", "address": "fa:16:3e:b3:6a:fc", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf518be0c-86", "ovs_interfaceid": "f518be0c-8624-4aef-8248-c09544c76756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.471554] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Releasing lock "refresh_cache-be35d888-f649-44e4-af23-341b8bfc81f6" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 583.475523] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Received event network-changed-e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 583.475523] env[67820]: DEBUG nova.compute.manager [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Refreshing instance network info cache due to event network-changed-e3a51c96-4813-4e0d-87c6-1c953da1e331. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 583.475523] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Acquiring lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.475523] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Acquired lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 583.475523] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Refreshing network info cache for port e3a51c96-4813-4e0d-87c6-1c953da1e331 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 583.485610] env[67820]: DEBUG nova.network.neutron [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Updated VIF entry in instance network info cache for port 5e92dc3b-a08e-4e33-b663-d2ad36167103. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 583.486705] env[67820]: DEBUG nova.network.neutron [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Updating instance_info_cache with network_info: [{"id": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "address": "fa:16:3e:ef:3e:22", "network": {"id": "6463dc42-b48e-476b-85b0-786f7793e7a9", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-381911185-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45335f4ca9ab47218fc21f73d66c7c4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dabbac20-1723-40ad-9da0-e53b28073651", "external-id": "nsx-vlan-transportzone-790", "segmentation_id": 790, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e92dc3b-a0", "ovs_interfaceid": "5e92dc3b-a08e-4e33-b663-d2ad36167103", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.501482] env[67820]: DEBUG oslo_concurrency.lockutils [req-2db4a8b5-e25b-4c97-b98b-e3f0ab79e978 req-e959763f-08de-4989-af0a-88ea8b52b94d service nova] Releasing lock "refresh_cache-1cc3b207-a628-4fe5-8908-6879483806b9" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 583.817531] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] 
Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 583.818660] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 583.818660] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.847869] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Successfully updated port: 03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 583.870313] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 583.870453] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 583.870600] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 583.938210] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.969355] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Updated VIF entry in instance network info cache for port e3a51c96-4813-4e0d-87c6-1c953da1e331. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 583.969355] env[67820]: DEBUG nova.network.neutron [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Updating instance_info_cache with network_info: [{"id": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "address": "fa:16:3e:4a:b2:c1", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3a51c96-48", "ovs_interfaceid": "e3a51c96-4813-4e0d-87c6-1c953da1e331", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.984782] env[67820]: DEBUG oslo_concurrency.lockutils [req-c24e29c5-4356-424c-b7db-d0bfd9c73d2d req-d00b10d4-20aa-40a6-94a8-db1ae9eb1cf2 service nova] Releasing lock "refresh_cache-573a28e3-bfc4-4b08-919b-65acbca79c7b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 584.386547] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Updating instance_info_cache with network_info: [{"id": "03b0f437-a163-4460-bd44-3a63a9131f48", "address": "fa:16:3e:78:2f:34", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b0f437-a1", "ovs_interfaceid": "03b0f437-a163-4460-bd44-3a63a9131f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.406080] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 584.406080] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance network_info: |[{"id": "03b0f437-a163-4460-bd44-3a63a9131f48", "address": "fa:16:3e:78:2f:34", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b0f437-a1", "ovs_interfaceid": "03b0f437-a163-4460-bd44-3a63a9131f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 584.406348] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:2f:34', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a10c88d7-d13f-44fd-acee-7a734eb5f56a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '03b0f437-a163-4460-bd44-3a63a9131f48', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 584.416205] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating folder: Project (7c89a428d22540a29a3801ff4639145c). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.417559] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b1ea40f-4a9e-4889-8a6c-0b990a42d8ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.433372] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created folder: Project (7c89a428d22540a29a3801ff4639145c) in parent group-v692668. 
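The "Instance VIF info" entry above is the neutron network_info blob boiled down to what the VMware driver needs: MAC address, the opaque NSX logical-switch reference, and the neutron port id. A sketch of that reduction using only fields visible in the log; the function name is made up, and the field choices mirror the logged output rather than the driver's full logic:

    # Reduce one neutron network_info entry (as logged) to the VIF info
    # dict shown in the log. Mirrors the logged fields only.
    def vif_info_from_network_info(vif):
        return {
            'network_name': vif['network']['bridge'],          # 'br-int'
            'mac_address': vif['address'],                     # 'fa:16:3e:...'
            'network_ref': {
                'type': 'OpaqueNetwork',
                'network-id': vif['details']['nsx-logical-switch-id'],
                'network-type': 'nsx.LogicalSwitch',
                'use-external-id': True,
            },
            'iface_id': vif['id'],                             # neutron port UUID
            'vif_model': 'vmxnet3',
        }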
[ 584.433577] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating folder: Instances. Parent ref: group-v692696. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.433827] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-86e6fc3c-dbf6-4891-b9af-523915743f48 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.445590] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created folder: Instances in parent group-v692696. [ 584.446088] env[67820]: DEBUG oslo.service.loopingcall [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 584.446088] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 584.446223] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-317370ca-5280-49ed-b42f-7549e4b63937 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.472422] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 584.472422] env[67820]: value = "task-3467298" [ 584.472422] env[67820]: _type = "Task" [ 584.472422] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 584.483484] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467298, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 584.984565] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467298, 'name': CreateVM_Task, 'duration_secs': 0.306664} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 584.984707] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 584.985644] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 584.985838] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 584.986828] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 584.986828] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ae9aa577-bfdf-4691-b825-3fa8844c2cfa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 584.993184] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 584.993184] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52370712-5f60-4369-829d-375d91c84979" [ 584.993184] env[67820]: _type = "Task" [ 584.993184] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 585.002713] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52370712-5f60-4369-829d-375d91c84979, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 585.506188] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 585.506726] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 585.506769] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.712225] env[67820]: DEBUG nova.compute.manager [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Received event network-vif-plugged-03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 585.712225] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Acquiring lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.712225] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.712225] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.713403] env[67820]: DEBUG nova.compute.manager [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] No waiting events found dispatching network-vif-plugged-03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 585.714030] env[67820]: WARNING nova.compute.manager [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Received unexpected event
network-vif-plugged-03b0f437-a163-4460-bd44-3a63a9131f48 for instance with vm_state building and task_state spawning. [ 585.714802] env[67820]: DEBUG nova.compute.manager [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Received event network-changed-03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 585.714802] env[67820]: DEBUG nova.compute.manager [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Refreshing instance network info cache due to event network-changed-03b0f437-a163-4460-bd44-3a63a9131f48. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 585.714802] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Acquiring lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.715853] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Acquired lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.716170] env[67820]: DEBUG nova.network.neutron [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Refreshing network info cache for port 03b0f437-a163-4460-bd44-3a63a9131f48 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 585.804863] env[67820]: DEBUG nova.compute.manager [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Received event network-vif-plugged-b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 585.805255] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Acquiring lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.807514] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.807514] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 585.807514] env[67820]: DEBUG nova.compute.manager [req-ee91983e-81a9-4615-9728-c5ae4428731d
req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] No waiting events found dispatching network-vif-plugged-b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 585.807514] env[67820]: WARNING nova.compute.manager [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Received unexpected event network-vif-plugged-b19cc479-df25-4b66-a6ad-93d584c8d1c1 for instance with vm_state building and task_state spawning. [ 585.808074] env[67820]: DEBUG nova.compute.manager [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Received event network-changed-b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 585.808074] env[67820]: DEBUG nova.compute.manager [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Refreshing instance network info cache due to event network-changed-b19cc479-df25-4b66-a6ad-93d584c8d1c1. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 585.808074] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Acquiring lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 585.808074] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Acquired lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 585.808074] env[67820]: DEBUG nova.network.neutron [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Refreshing network info cache for port b19cc479-df25-4b66-a6ad-93d584c8d1c1 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 586.162301] env[67820]: DEBUG nova.network.neutron [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Updated VIF entry in instance network info cache for port b19cc479-df25-4b66-a6ad-93d584c8d1c1. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 586.162713] env[67820]: DEBUG nova.network.neutron [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Updating instance_info_cache with network_info: [{"id": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "address": "fa:16:3e:6c:d5:fc", "network": {"id": "1f370e52-9a90-4dbb-a769-629fedc29049", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1745213521-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f5e2fd2445af40d98840b854a6a50ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a1895250-76cc-41f7-b7f8-2e5679494607", "external-id": "nsx-vlan-transportzone-785", "segmentation_id": 785, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb19cc479-df", "ovs_interfaceid": "b19cc479-df25-4b66-a6ad-93d584c8d1c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.175483] env[67820]: DEBUG oslo_concurrency.lockutils [req-ee91983e-81a9-4615-9728-c5ae4428731d req-575fee0a-8b57-40ca-87c0-191a2eae444e service nova] Releasing lock "refresh_cache-80f480dc-9bb8-4764-9b6b-793c0954962e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 586.596141] env[67820]: DEBUG nova.network.neutron [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Updated VIF entry in instance network info cache for port 03b0f437-a163-4460-bd44-3a63a9131f48. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 586.596141] env[67820]: DEBUG nova.network.neutron [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Updating instance_info_cache with network_info: [{"id": "03b0f437-a163-4460-bd44-3a63a9131f48", "address": "fa:16:3e:78:2f:34", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b0f437-a1", "ovs_interfaceid": "03b0f437-a163-4460-bd44-3a63a9131f48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.608709] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e3a3383-0346-4b7b-9ae1-811cb90a28e6 req-88ce5066-c54b-4d25-8dec-cc1b785eb400 service nova] Releasing lock "refresh_cache-f1fcb6fc-97d9-46ed-ae53-a27f58992378" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 589.190059] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.190685] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 592.895071] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 592.895384] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626
tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.111111] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efc88ee7-caa1-4bc4-bac7-632d96f2c478 tempest-ServersAdminNegativeTestJSON-752970805 tempest-ServersAdminNegativeTestJSON-752970805-project-member] Acquiring lock "8909b0c9-f236-4d22-b3f1-3bf15b82aa0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.111458] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efc88ee7-caa1-4bc4-bac7-632d96f2c478 tempest-ServersAdminNegativeTestJSON-752970805 tempest-ServersAdminNegativeTestJSON-752970805-project-member] Lock "8909b0c9-f236-4d22-b3f1-3bf15b82aa0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 595.198920] env[67820]: DEBUG oslo_concurrency.lockutils [None req-cfd23513-cb83-4b8e-90e6-eb9e5ae733d3 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Acquiring lock "ed58f91c-e284-4877-8d7c-f1f9e9f1add8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 595.199542] env[67820]: DEBUG oslo_concurrency.lockutils [None req-cfd23513-cb83-4b8e-90e6-eb9e5ae733d3 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "ed58f91c-e284-4877-8d7c-f1f9e9f1add8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 596.018481] env[67820]: DEBUG oslo_concurrency.lockutils [None req-226f1a8b-62a1-44b3-8f59-ea4a403d7d54 tempest-ServersAdmin275Test-1182303969 tempest-ServersAdmin275Test-1182303969-project-member] Acquiring lock "3ed56b60-f2a0-4ada-ace0-3b85d93693f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 596.018481] env[67820]: DEBUG oslo_concurrency.lockutils [None req-226f1a8b-62a1-44b3-8f59-ea4a403d7d54 tempest-ServersAdmin275Test-1182303969 tempest-ServersAdmin275Test-1182303969-project-member] Lock "3ed56b60-f2a0-4ada-ace0-3b85d93693f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 597.336869] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e8b4a595-43d9-49eb-b7c4-4f3887c6469d tempest-ServersWithSpecificFlavorTestJSON-693409077 tempest-ServersWithSpecificFlavorTestJSON-693409077-project-member] Acquiring lock "7fc2bae2-34a4-472a-a097-f93245de32bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 597.337170] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e8b4a595-43d9-49eb-b7c4-4f3887c6469d tempest-ServersWithSpecificFlavorTestJSON-693409077 tempest-ServersWithSpecificFlavorTestJSON-693409077-project-member] Lock "7fc2bae2-34a4-472a-a097-f93245de32bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.385289] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4d4b2146-9c6e-4365-adfa-a75355122047 tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Acquiring lock "18d924ce-620d-45d1-92cf-3f8cfa5a81b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.387882] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4d4b2146-9c6e-4365-adfa-a75355122047 tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Lock "18d924ce-620d-45d1-92cf-3f8cfa5a81b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 598.545183] env[67820]: DEBUG oslo_concurrency.lockutils [None req-11420d02-e21b-4a3d-819f-c540282d5490 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Acquiring lock "66add717-36cf-4328-8d8c-32beb0eca333" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.545795] env[67820]: DEBUG oslo_concurrency.lockutils [None req-11420d02-e21b-4a3d-819f-c540282d5490 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Lock "66add717-36cf-4328-8d8c-32beb0eca333" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.381680] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7f33c58f-d3c2-4536-b751-42f9e23b550f tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Acquiring lock "5a22e980-f0c4-4fca-a4bf-e5c1347e9b80" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.381938] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7f33c58f-d3c2-4536-b751-42f9e23b550f tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Lock "5a22e980-f0c4-4fca-a4bf-e5c1347e9b80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.787038] env[67820]: DEBUG oslo_concurrency.lockutils [None req-66665645-914b-49c0-ac8a-35ec51f6ee00
tempest-ServersTestJSON-243953592 tempest-ServersTestJSON-243953592-project-member] Acquiring lock "91d81e24-4166-4135-9a3b-fe117bdf9c2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.788200] env[67820]: DEBUG oslo_concurrency.lockutils [None req-66665645-914b-49c0-ac8a-35ec51f6ee00 tempest-ServersTestJSON-243953592 tempest-ServersTestJSON-243953592-project-member] Lock "91d81e24-4166-4135-9a3b-fe117bdf9c2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.120512] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1c5c390a-9fce-4725-928b-bc6ea81abc4e tempest-ServerDiagnosticsNegativeTest-1545651443 tempest-ServerDiagnosticsNegativeTest-1545651443-project-member] Acquiring lock "d41bef54-918d-4a67-9666-4beb8bd1d1dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.121197] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1c5c390a-9fce-4725-928b-bc6ea81abc4e tempest-ServerDiagnosticsNegativeTest-1545651443 tempest-ServerDiagnosticsNegativeTest-1545651443-project-member] Lock "d41bef54-918d-4a67-9666-4beb8bd1d1dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.297198] env[67820]: DEBUG oslo_concurrency.lockutils [None req-39398c22-bea1-450a-80fe-4d59fe3c23ba tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Acquiring lock "b566340c-f89b-43e3-afae-83b67d4b169c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.297198] env[67820]: DEBUG oslo_concurrency.lockutils [None req-39398c22-bea1-450a-80fe-4d59fe3c23ba tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Lock "b566340c-f89b-43e3-afae-83b67d4b169c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.323336] env[67820]: DEBUG oslo_concurrency.lockutils [None req-3f22e63e-365c-49c5-87d3-8ddc7c25112c tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Acquiring lock "87d3ff6f-df3e-4fbe-98b3-98878945da63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.323605] env[67820]: DEBUG oslo_concurrency.lockutils [None req-3f22e63e-365c-49c5-87d3-8ddc7c25112c tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Lock "87d3ff6f-df3e-4fbe-98b3-98878945da63" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 602.423525] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a7e1b9e3-e593-4d7d-ab46-8dab1e3e85f8 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Acquiring lock "6c978d86-e29b-4fd6-99e6-1ac37678871d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 602.423748] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a7e1b9e3-e593-4d7d-ab46-8dab1e3e85f8 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Lock "6c978d86-e29b-4fd6-99e6-1ac37678871d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.370503] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d987cfa7-92a5-450c-b419-beb7ddb9fb93 tempest-ImagesOneServerTestJSON-1009029765 tempest-ImagesOneServerTestJSON-1009029765-project-member] Acquiring lock "08b4411d-6ed5-453a-92f2-1dd0b1ee2140" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.370813] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d987cfa7-92a5-450c-b419-beb7ddb9fb93 tempest-ImagesOneServerTestJSON-1009029765 tempest-ImagesOneServerTestJSON-1009029765-project-member] Lock "08b4411d-6ed5-453a-92f2-1dd0b1ee2140" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.490728] env[67820]: WARNING oslo_vmware.rw_handles [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 613.490728] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 
613.490728] env[67820]: ERROR oslo_vmware.rw_handles [ 613.491942] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 613.493224] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 613.493479] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Copying Virtual Disk [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/0fd96d9b-4d01-4e03-916a-5ba8318745df/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 613.494038] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1e4ebe58-6123-4bd8-a321-d5327c1a8e15 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.503171] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for the task: (returnval){ [ 613.503171] env[67820]: value = "task-3467303" [ 613.503171] env[67820]: _type = "Task" [ 613.503171] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 613.523021] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Task: {'id': task-3467303, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 614.018631] env[67820]: DEBUG oslo_vmware.exceptions [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 614.019010] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 614.023098] env[67820]: ERROR nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 614.023098] env[67820]: Faults: ['InvalidArgument'] [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Traceback (most recent call last): [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] yield resources [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self.driver.spawn(context, instance, image_meta, [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self._fetch_image_if_missing(context, vi) [ 614.023098] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] image_cache(vi, tmp_image_ds_loc) [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] vm_util.copy_virtual_disk( [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] session._wait_for_task(vmdk_copy_task) [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] return self.wait_for_task(task_ref) [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] return evt.wait() [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] result = hub.switch() [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 614.023604] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] return self.greenlet.switch() [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self.f(*self.args, **self.kw) [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] raise exceptions.translate_fault(task_info.error) [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Faults: ['InvalidArgument'] [ 614.024048] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] [ 614.024048] env[67820]: INFO nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Terminating instance [ 614.025315] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 614.025583] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 614.026210] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock 
"refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 614.026409] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquired lock "refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 614.026632] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 614.027660] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ef146471-48cf-435b-b446-dcefc6020356 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.035703] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 614.035900] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 614.036795] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-adea66be-8406-4a09-9679-090ca59cd889 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.046897] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){ [ 614.046897] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5200cc25-0e27-d98a-a4a0-c85937db276f" [ 614.046897] env[67820]: _type = "Task" [ 614.046897] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 614.055051] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5200cc25-0e27-d98a-a4a0-c85937db276f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 614.088433] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 614.219896] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.236735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Releasing lock "refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 614.237252] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 614.237497] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 614.240198] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-732bc329-e93e-4979-91d7-eaf86b5bd44e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.252858] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 614.252858] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-76ee0a8e-443a-4b5e-9cb1-f6c33eee6e8e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.288391] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 614.291025] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 614.291025] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Deleting the datastore file [datastore1] 11335aca-6576-4d27-b50b-95ecc0c157a1 
{{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 614.291025] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-76828f8e-c01c-489d-9189-fde421547716 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.298251] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for the task: (returnval){ [ 614.298251] env[67820]: value = "task-3467305" [ 614.298251] env[67820]: _type = "Task" [ 614.298251] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 614.311107] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Task: {'id': task-3467305, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 614.564442] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 614.564442] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating directory with path [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 614.564442] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc1e2bdf-ca6a-4422-80ee-cec47614707b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.589048] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created directory with path [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 614.589048] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Fetch image to [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 614.589394] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 614.590124] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c16efcd-b608-4f3c-b286-a106a6ed6e71 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.598118] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-054616f6-2fef-4020-8edf-5ecaa050b7b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.612710] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4f5e04e-f266-40e7-91d9-015a5d876b42 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.654799] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7ceebb8-b329-4026-b853-6ba69e46bbf2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.664223] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-928eb820-f7e3-4504-b6a4-c68d1712de07 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 614.687347] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 614.762163] env[67820]: DEBUG oslo_vmware.rw_handles [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 614.840808] env[67820]: DEBUG oslo_vmware.rw_handles [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 614.841674] env[67820]: DEBUG oslo_vmware.rw_handles [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 614.847056] env[67820]: DEBUG oslo_vmware.api [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Task: {'id': task-3467305, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.056475} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 614.848338] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 614.848338] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 614.848338] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 614.849081] env[67820]: INFO nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Took 0.61 seconds to destroy the instance on the hypervisor. [ 614.849533] env[67820]: DEBUG oslo.service.loopingcall [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 614.849626] env[67820]: DEBUG nova.compute.manager [-] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 614.852152] env[67820]: DEBUG nova.compute.claims [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 614.852330] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 614.852576] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 615.374900] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5551a5d-8c52-40fa-9e7c-d7c96039982f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.386087] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31e36ee4-fe00-44f0-9ac1-68599ee435db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.420537] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20fc3886-dcaf-4356-a722-a7e11ed07cf8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.428707] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e946aed3-b3a5-4618-b8d6-a88063c15d3c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 615.443082] env[67820]: DEBUG nova.compute.provider_tree [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 615.455159] env[67820]: DEBUG nova.scheduler.client.report [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 615.479910] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.627s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 615.480457] env[67820]: ERROR nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 615.480457] env[67820]: Faults: ['InvalidArgument'] [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Traceback (most recent call last): [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self.driver.spawn(context, instance, image_meta, [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self._fetch_image_if_missing(context, vi) [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] image_cache(vi, tmp_image_ds_loc) [ 615.480457] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] vm_util.copy_virtual_disk( [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] session._wait_for_task(vmdk_copy_task) [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] return self.wait_for_task(task_ref) [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 
11335aca-6576-4d27-b50b-95ecc0c157a1] return evt.wait() [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] result = hub.switch() [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] return self.greenlet.switch() [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 615.480854] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] self.f(*self.args, **self.kw) [ 615.481266] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 615.481266] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] raise exceptions.translate_fault(task_info.error) [ 615.481266] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 615.481266] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Faults: ['InvalidArgument'] [ 615.481266] env[67820]: ERROR nova.compute.manager [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] [ 615.481266] env[67820]: DEBUG nova.compute.utils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 615.485387] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Build of instance 11335aca-6576-4d27-b50b-95ecc0c157a1 was re-scheduled: A specified parameter was not correct: fileType [ 615.485387] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 615.485999] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 615.486263] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquiring lock "refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 615.486427] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Acquired lock "refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 615.486587] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 615.550308] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 615.953835] env[67820]: DEBUG nova.network.neutron [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.970764] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Releasing lock "refresh_cache-11335aca-6576-4d27-b50b-95ecc0c157a1" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 615.971261] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 615.971261] env[67820]: DEBUG nova.compute.manager [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] [instance: 11335aca-6576-4d27-b50b-95ecc0c157a1] Skipping network deallocation for instance since networking was not requested. 
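
The traceback above shows the whole failure path: CopyVirtualDisk_Task errors on the vCenter side, oslo_vmware's task poller translates the fault into a VimFaultException, and _build_and_run_instance treats it as a build failure, aborts the resource claim, and re-schedules. A condensed sketch of how a caller sees that exception (the session and task objects are stand-ins):

    from oslo_vmware import exceptions as vexc

    def copy_virtual_disk(session, vmdk_copy_task):
        try:
            # wait_for_task blocks (on an eventlet event) until the vCenter
            # task finishes, raising the translated fault on error.
            return session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as e:
            # fault_list holds the raw VMware fault names seen in the log,
            # e.g. ['InvalidArgument'] for the bad fileType parameter.
            print("disk copy failed: %s (faults: %s)" % (e, e.fault_list))
            raise  # the compute manager re-schedules the build
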
{{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 616.127206] env[67820]: INFO nova.scheduler.client.report [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Deleted allocations for instance 11335aca-6576-4d27-b50b-95ecc0c157a1 [ 616.154185] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0f30641b-0e6d-4669-a14c-948dc37d6625 tempest-ServerDiagnosticsV248Test-981105951 tempest-ServerDiagnosticsV248Test-981105951-project-member] Lock "11335aca-6576-4d27-b50b-95ecc0c157a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.474s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 616.198837] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 616.278015] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 616.278322] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 616.279825] env[67820]: INFO nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 616.711199] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-241e033e-aecb-487e-86ba-ee6fdc786714 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.719853] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb77ca2-57b5-4ff9-a487-93d4a85974bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.756988] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b11bffd-8a92-4f3b-81c6-0d0cbc4a5c6c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.764411] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27030725-99dd-440c-b209-8a3e7fd93673 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.778922] env[67820]: DEBUG nova.compute.provider_tree [None 
req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 616.787670] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 616.807021] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.527s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 616.807021] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 616.840857] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 616.845021] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 616.845021] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 616.856892] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Start building block device mappings for instance. 
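
The inventory payload reported to placement above fixes the node's schedulable capacity: per resource class, placement may allocate up to (total - reserved) * allocation_ratio. Plugging in the logged numbers as a quick check:

    # Values copied from the set_inventory_for_provider line above.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # -> VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
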
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 616.933271] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 616.959162] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 616.959162] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 616.959162] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 616.959735] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 616.959735] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 616.959735] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 616.959843] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 616.960561] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 616.962380] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 616.962501] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 616.962678] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 616.964066] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66aa3711-c335-4fb2-907e-0dc6a935aec4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 616.972723] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55c96b28-1370-4d70-968f-15c298a9b08d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 617.060038] env[67820]: DEBUG nova.policy [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2aa367ffec7d4b3caa69171ba56159b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c89a428d22540a29a3801ff4639145c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 617.943154] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Successfully created port: f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 619.875027] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Successfully updated port: f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 619.894439] 
env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 619.894726] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 619.894801] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 620.037734] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 620.758405] env[67820]: DEBUG nova.compute.manager [req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Received event network-vif-plugged-f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 620.758665] env[67820]: DEBUG oslo_concurrency.lockutils [req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] Acquiring lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.761027] env[67820]: DEBUG oslo_concurrency.lockutils [req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.761027] env[67820]: DEBUG oslo_concurrency.lockutils [req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 620.761027] env[67820]: DEBUG nova.compute.manager [req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] No waiting events found dispatching network-vif-plugged-f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 620.761027] env[67820]: WARNING nova.compute.manager 
[req-29510981-6927-44d7-aa9f-6eb14977a972 req-6e81bcfc-67e2-4ab5-9233-7a77878b067e service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Received unexpected event network-vif-plugged-f77a9c00-d10b-4801-a270-73dee663e85e for instance with vm_state building and task_state spawning. [ 620.780472] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Updating instance_info_cache with network_info: [{"id": "f77a9c00-d10b-4801-a270-73dee663e85e", "address": "fa:16:3e:46:e7:a6", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf77a9c00-d1", "ovs_interfaceid": "f77a9c00-d10b-4801-a270-73dee663e85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.799639] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 620.799963] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance network_info: |[{"id": "f77a9c00-d10b-4801-a270-73dee663e85e", "address": "fa:16:3e:46:e7:a6", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf77a9c00-d1", "ovs_interfaceid": 
"f77a9c00-d10b-4801-a270-73dee663e85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 620.800376] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:e7:a6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a10c88d7-d13f-44fd-acee-7a734eb5f56a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f77a9c00-d10b-4801-a270-73dee663e85e', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 620.810782] env[67820]: DEBUG oslo.service.loopingcall [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 620.812810] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 620.813039] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7dbc607d-ec60-4207-9db3-53f7bb759284 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.841711] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 620.841711] env[67820]: value = "task-3467309" [ 620.841711] env[67820]: _type = "Task" [ 620.841711] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 620.853900] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467309, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 621.360887] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467309, 'name': CreateVM_Task, 'duration_secs': 0.287672} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 621.361173] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 621.362503] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 621.362503] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 621.362503] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 621.362503] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16a8319e-4be1-43ea-9da2-0ed4542f8e9c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.368515] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 621.368515] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]528b6d78-78d5-99df-798c-a5ce6b6ee17e" [ 621.368515] env[67820]: _type = "Task" [ 621.368515] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 621.377371] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]528b6d78-78d5-99df-798c-a5ce6b6ee17e, 'name': SearchDatastore_Task} progress is 0%. 
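
The lock-and-semaphore dance above serializes work on the shared image cache: the cache path itself is the lock name, so every build using image 4407539e-... funnels through one holder at a time, and the external semaphore extends that across processes. A minimal sketch with oslo.concurrency (the lock_path is a placeholder):

    from oslo_concurrency import lockutils

    CACHE_LOCK = ("[datastore1] devstack-image-cache_base/"
                  "4407539e-b292-42b4-91c4-4faa60d48bab")

    def process_cached_image():
        # external=True adds a file-based lock so separate processes on
        # the host also serialize on this name.
        with lockutils.lock(CACHE_LOCK, external=True,
                            lock_path="/tmp/nova-locks"):
            # Safe to inspect or copy the cached VMDK here; concurrent
            # builds of the same image wait on this lock.
            pass
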
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 621.879598] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 621.879850] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 621.880124] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 622.136888] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "906c40dd-b6d6-492a-aa51-58901959a60d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 622.137963] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 622.869488] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0fed6338-d74f-4e45-a2b7-f7e9c752db04 tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Acquiring lock "5feadca8-33f0-4dac-8e1d-162c77919c77" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 622.869764] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0fed6338-d74f-4e45-a2b7-f7e9c752db04 tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Lock "5feadca8-33f0-4dac-8e1d-162c77919c77" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 624.410686] env[67820]: DEBUG nova.compute.manager [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Received event network-changed-f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 624.410994] env[67820]: DEBUG nova.compute.manager [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Refreshing instance network info cache due to event network-changed-f77a9c00-d10b-4801-a270-73dee663e85e. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 624.411091] env[67820]: DEBUG oslo_concurrency.lockutils [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] Acquiring lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 624.411334] env[67820]: DEBUG oslo_concurrency.lockutils [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] Acquired lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 624.411407] env[67820]: DEBUG nova.network.neutron [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Refreshing network info cache for port f77a9c00-d10b-4801-a270-73dee663e85e {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 624.464653] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51d4a037-8ff1-436a-babe-812ab4df76e5 tempest-ServerRescueTestJSON-1486015266 tempest-ServerRescueTestJSON-1486015266-project-member] Acquiring lock "2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 624.465176] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51d4a037-8ff1-436a-babe-812ab4df76e5 tempest-ServerRescueTestJSON-1486015266 tempest-ServerRescueTestJSON-1486015266-project-member] Lock "2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 625.286028] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f301528a-ae39-4cbc-9d5a-f03a736f97fc tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Acquiring lock "da533036-7e32-4078-9060-6ee7680cba5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 625.286028] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f301528a-ae39-4cbc-9d5a-f03a736f97fc tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Lock "da533036-7e32-4078-9060-6ee7680cba5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 625.349444] env[67820]: DEBUG nova.network.neutron [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] 
Updated VIF entry in instance network info cache for port f77a9c00-d10b-4801-a270-73dee663e85e. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 625.350319] env[67820]: DEBUG nova.network.neutron [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Updating instance_info_cache with network_info: [{"id": "f77a9c00-d10b-4801-a270-73dee663e85e", "address": "fa:16:3e:46:e7:a6", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf77a9c00-d1", "ovs_interfaceid": "f77a9c00-d10b-4801-a270-73dee663e85e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.371677] env[67820]: DEBUG oslo_concurrency.lockutils [req-d853b764-aadd-4ee4-a8ec-93e40a289440 req-6eefca72-63d6-4bff-9243-e6cbdf8f836b service nova] Releasing lock "refresh_cache-939b4cbc-aa1c-4995-9675-4c3d1f4dce55" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 629.715340] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b752fbe6-46ca-44b6-b052-b77aa50af329 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454-project-member] Acquiring lock "4e356219-fa88-474c-97fe-6f6a6ef0c90d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 629.715637] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b752fbe6-46ca-44b6-b052-b77aa50af329 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454-project-member] Lock "4e356219-fa88-474c-97fe-6f6a6ef0c90d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.476595] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e207ec3d-d65a-4446-9f8f-4d954dd4a000 tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Acquiring lock "454a0392-c614-4ca8-903e-48efa44be22f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 630.476812] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-e207ec3d-d65a-4446-9f8f-4d954dd4a000 tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Lock "454a0392-c614-4ca8-903e-48efa44be22f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.456095] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d90fef5e-1cf7-4482-bbf5-b798e9a0355a tempest-ServerDiagnosticsTest-1880289301 tempest-ServerDiagnosticsTest-1880289301-project-member] Acquiring lock "90319a68-5613-4b18-91d3-b606d258ced9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.456419] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d90fef5e-1cf7-4482-bbf5-b798e9a0355a tempest-ServerDiagnosticsTest-1880289301 tempest-ServerDiagnosticsTest-1880289301-project-member] Lock "90319a68-5613-4b18-91d3-b606d258ced9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 632.144566] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8fa02df1-30a7-4ecb-9dac-48e772013f76 tempest-ServersTestBootFromVolume-725629375 tempest-ServersTestBootFromVolume-725629375-project-member] Acquiring lock "b80e4804-6f23-4059-8e9c-bf8ecdc2efc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 632.144566] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8fa02df1-30a7-4ecb-9dac-48e772013f76 tempest-ServersTestBootFromVolume-725629375 tempest-ServersTestBootFromVolume-725629375-project-member] Lock "b80e4804-6f23-4059-8e9c-bf8ecdc2efc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.220000] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5d236fa-aa9c-424f-be60-c88fc95baa29 tempest-AttachInterfacesUnderV243Test-989850202 tempest-AttachInterfacesUnderV243Test-989850202-project-member] Acquiring lock "8e49f35a-037b-4db4-8bec-005b50905852" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.221075] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5d236fa-aa9c-424f-be60-c88fc95baa29 tempest-AttachInterfacesUnderV243Test-989850202 tempest-AttachInterfacesUnderV243Test-989850202-project-member] Lock "8e49f35a-037b-4db4-8bec-005b50905852" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 636.182481] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks 
[ 636.182481] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 636.207216] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 636.207420] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 636.208027] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 636.621540] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 636.621776] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 637.621367] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 637.621608] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 637.621673] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 637.643336] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.643504] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.643633] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.643757] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.643880] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.644015] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.645049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.645049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.645049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.645049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 637.645049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 637.645275] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 637.645862] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 637.645862] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
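Note: every "Running periodic task ..." line comes from oslo.service's PeriodicTasks runner walking its registered tasks; tasks can bail out early, as _reclaim_queued_deletes does above when CONF.reclaim_instance_interval <= 0. A minimal sketch of declaring and driving such tasks, assuming oslo.service and oslo.config are available (the task name and body are illustrative):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _poll_unconfirmed_resizes(self, context):
            pass  # runs roughly every 60 seconds

    mgr = Manager(cfg.CONF)
    # The service loop calls this repeatedly; each due task is logged as
    # "Running periodic task ..." at DEBUG before it executes.
    mgr.run_periodic_tasks(context=None)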
[ 637.645862] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 637.657940] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 637.658170] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 637.658335] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 637.658483] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 637.659576] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ae0085f-4eba-4bce-8ed3-486d9cb78286 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 637.669352] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-295b242d-9ce0-4600-9b07-1f9affe4f6a3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 637.683267] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd3fe12b-6355-4441-b001-e94f14afb54e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 637.690253] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9155a0d-b622-4250-bbfe-6ee20a13922a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 637.720029] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180879MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 637.720029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 637.720029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 637.790981] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d08e77ae-af85-4dfe-86e7-60f850369485 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791167] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ca9f607c-cde9-459e-aa9e-4b060bc8a68b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791298] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791963] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 14478951-d2c1-4472-af0c-354757e0bb0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791963] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791963] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.791963] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.792227] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.792227] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.792227] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 637.832351] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.856311] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.868992] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.879456] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8909b0c9-f236-4d22-b3f1-3bf15b82aa0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.889783] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ed58f91c-e284-4877-8d7c-f1f9e9f1add8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.901606] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ed56b60-f2a0-4ada-ace0-3b85d93693f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.912211] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7fc2bae2-34a4-472a-a097-f93245de32bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.923914] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 18d924ce-620d-45d1-92cf-3f8cfa5a81b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.934020] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 66add717-36cf-4328-8d8c-32beb0eca333 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.958138] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5a22e980-f0c4-4fca-a4bf-e5c1347e9b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.969155] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 91d81e24-4166-4135-9a3b-fe117bdf9c2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.979813] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d41bef54-918d-4a67-9666-4beb8bd1d1dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 637.990383] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b566340c-f89b-43e3-afae-83b67d4b169c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.000019] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 87d3ff6f-df3e-4fbe-98b3-98878945da63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.009641] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 6c978d86-e29b-4fd6-99e6-1ac37678871d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.019745] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 08b4411d-6ed5-453a-92f2-1dd0b1ee2140 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.029234] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.038421] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5feadca8-33f0-4dac-8e1d-162c77919c77 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.047138] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.056521] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance da533036-7e32-4078-9060-6ee7680cba5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.065444] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4e356219-fa88-474c-97fe-6f6a6ef0c90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.074258] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 454a0392-c614-4ca8-903e-48efa44be22f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.083224] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 90319a68-5613-4b18-91d3-b606d258ced9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.092850] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b80e4804-6f23-4059-8e9c-bf8ecdc2efc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.102978] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8e49f35a-037b-4db4-8bec-005b50905852 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 638.103724] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 638.103724] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 638.496246] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae081dc-6072-4a6a-b221-f5f1b6419f4b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 638.504731] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f58c33a9-1e41-4967-8969-4112f0566b10 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 638.535057] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-806ec0e1-19a8-45a6-9758-2a61eff175e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 638.542322] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56f48d35-1824-492d-8e7a-dfeaf01be0f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 638.555323] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 638.563395] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 638.577946] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 638.577946] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.858s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
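Note: the final view is arithmetically consistent with the per-instance claims listed above: ten actively managed instances, each holding 1 VCPU, 128 MB of RAM and 1 GB of disk, plus the 512 MB reserved in the MEMORY_MB inventory. A quick check of the reported numbers (all values taken from the log; the capacity formula is standard placement semantics, (total - reserved) * allocation_ratio):

    instances = 10
    used_vcpus = instances * 1           # 10, matches used_vcpus=10
    used_disk_gb = instances * 1         # 10, matches used_disk=10GB
    used_ram_mb = 512 + instances * 128  # 1792, matches used_ram=1792MB

    vcpu_capacity = (48 - 0) * 4.0       # 192 schedulable VCPUs
    ram_capacity = (196590 - 512) * 1.0  # 196078 schedulable MB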
[ 663.511033] env[67820]: WARNING oslo_vmware.rw_handles [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles     response.begin()
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 663.511033] env[67820]: ERROR oslo_vmware.rw_handles
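Note: the traceback shows rw_handles' close() calling http.client's getresponse() after streaming the image upload, and the remote end dropping the socket before sending a status line. A minimal sketch of that upload-then-read pattern with the failure handled, assuming only the standard library (host, path and the chunk iterator are placeholders, not the driver's actual code):

    import http.client

    def upload(host, path, chunks, size):
        # Stream `size` bytes to an HTTPS endpoint, then read the response.
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest('PUT', path)
        conn.putheader('Content-Length', str(size))
        conn.endheaders()
        for chunk in chunks:        # chunks must yield bytes
            conn.send(chunk)
        try:
            # This is the step that raised above: the server closed the
            # socket without sending a status line.
            return conn.getresponse().status
        except http.client.RemoteDisconnected:
            return None  # treat as a failed upload; let the caller retry
        finally:
            conn.close()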
[ 663.511720] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 663.513076] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 663.513355] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Copying Virtual Disk [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/6e180892-cf68-485a-9715-9ca83a2ad117/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 663.513696] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0bacb365-bfbd-456b-9f7b-7a32c78bcd25 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 663.521730] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){
[ 663.521730] env[67820]: value = "task-3467314"
[ 663.521730] env[67820]: _type = "Task"
[ 663.521730] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 663.531403] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': task-3467314, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 664.034168] env[67820]: DEBUG oslo_vmware.exceptions [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 664.034168] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 664.034168] env[67820]: ERROR nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 664.034168] env[67820]: Faults: ['InvalidArgument']
[ 664.034168] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Traceback (most recent call last):
[ 664.034168] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 664.034168] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     yield resources
[ 664.034168] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 664.034168] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     self.driver.spawn(context, instance, image_meta,
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     self._fetch_image_if_missing(context, vi)
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     image_cache(vi, tmp_image_ds_loc)
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     vm_util.copy_virtual_disk(
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     session._wait_for_task(vmdk_copy_task)
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     return self.wait_for_task(task_ref)
[ 664.034506] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     return evt.wait()
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     result = hub.switch()
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     return self.greenlet.switch()
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     self.f(*self.args, **self.kw)
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]     raise exceptions.translate_fault(task_info.error)
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Faults: ['InvalidArgument']
[ 664.034871] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b]
[ 664.035238] env[67820]: INFO nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Terminating instance
[ 664.035652] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 664.035854] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 664.036190] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5bf9dd94-fd1a-482c-a330-2cbf382059fc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.038398] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 664.038586] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 664.039338] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed69a662-7be4-4ea0-9ce4-f8251b9f5040 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.046214] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 664.046437] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e252953e-70ab-414e-9a35-f22a2d588e7c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.048653] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 664.048829] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 664.049819] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-89cd7306-b250-4299-91fc-c90b58e6678d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.054680] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){
[ 664.054680] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d05889-82d9-4257-c078-d96714a5fb67"
[ 664.054680] env[67820]: _type = "Task"
[ 664.054680] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 664.062392] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d05889-82d9-4257-c078-d96714a5fb67, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 664.112238] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 664.112474] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 664.112657] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleting the datastore file [datastore1] ca9f607c-cde9-459e-aa9e-4b060bc8a68b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 664.112935] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-16091b30-fb70-4dee-95ff-634b7f1b2c0d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.119381] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){
[ 664.119381] env[67820]: value = "task-3467316"
[ 664.119381] env[67820]: _type = "Task"
[ 664.119381] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 664.126821] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': task-3467316, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
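Note: between "Waiting for the task" and the "progress is ...%" lines, oslo.vmware polls the vCenter task on a fixed interval (the traceback above already showed its loopingcall driver in _poll_task). A minimal sketch of the same fixed-interval polling pattern using oslo.service, assuming a check_done() callable (names are illustrative):

    from oslo_service import loopingcall

    def poll(check_done, interval=0.5):
        # Run check_done() every `interval` seconds until it signals completion.
        def _tick():
            if check_done():
                # Stops the loop and resolves the wait() below.
                raise loopingcall.LoopingCallDone(retvalue='done')
        timer = loopingcall.FixedIntervalLoopingCall(_tick)
        return timer.start(interval=interval).wait()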
[ 664.564502] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 664.564754] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 664.564987] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2dd1d907-bd5c-4d05-8f05-ce0a47cf7e32 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.576583] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 664.576822] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Fetch image to [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 664.577049] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 664.577882] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c22021c-54bd-4888-ac2f-16054d44acb6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.584984] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-590a3005-f48e-4045-8320-4912404aa91a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.595255] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cb669ad-7983-4dfe-af2c-764763cdd5e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.629091] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a1e6e0-b46a-46a2-894b-60974b66b2ee {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 664.636365] env[67820]: DEBUG oslo_vmware.api [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': task-3467316, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073045} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 664.637795] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 664.638024] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 664.638215] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 664.638388] env[67820]: INFO nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Took 0.60 seconds to destroy the instance on the hypervisor.
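Note: the reported 0.60 seconds agrees with the surrounding timestamps: destruction began at 664.038398 ("Start destroying the instance on the hypervisor.") and the INFO line lands at 664.638388, and 664.638388 - 664.038398 = 0.600 s to the rounding shown.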
[ 664.640142] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fe3fb4ac-59c0-449f-857f-a8435ee11630 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 664.642107] env[67820]: DEBUG nova.compute.claims [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 664.642280] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.642512] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.674120] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 664.726929] env[67820]: DEBUG oslo_vmware.rw_handles [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 664.789429] env[67820]: DEBUG oslo_vmware.rw_handles [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 664.789637] env[67820]: DEBUG oslo_vmware.rw_handles [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 665.138021] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7abb1a5c-99c2-4409-8627-8ce67ddd6b01 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.145208] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57701495-77cf-46a8-82c2-0cdd002033b7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.176203] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd9b0c28-e88e-4367-9250-e78e901a4f16 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.183522] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8659a1ad-844d-49ab-ad41-0898da77c032 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 665.196378] env[67820]: DEBUG nova.compute.provider_tree [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 665.204768] env[67820]: DEBUG nova.scheduler.client.report [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 665.220496] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.578s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 665.221044] env[67820]: ERROR nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.221044] env[67820]: Faults: ['InvalidArgument'] [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Traceback (most recent call last): [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: 
ca9f607c-cde9-459e-aa9e-4b060bc8a68b] self.driver.spawn(context, instance, image_meta, [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] self._fetch_image_if_missing(context, vi) [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] image_cache(vi, tmp_image_ds_loc) [ 665.221044] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] vm_util.copy_virtual_disk( [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] session._wait_for_task(vmdk_copy_task) [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] return self.wait_for_task(task_ref) [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] return evt.wait() [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] result = hub.switch() [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] return self.greenlet.switch() [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 665.221362] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] self.f(*self.args, **self.kw) [ 665.221669] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 665.221669] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] raise exceptions.translate_fault(task_info.error) [ 665.221669] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 665.221669] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Faults: ['InvalidArgument'] [ 665.221669] env[67820]: ERROR nova.compute.manager [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] [ 665.221773] env[67820]: DEBUG nova.compute.utils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 665.223408] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Build of instance ca9f607c-cde9-459e-aa9e-4b060bc8a68b was re-scheduled: A specified parameter was not correct: fileType [ 665.223408] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 665.223785] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 665.223926] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 665.224093] env[67820]: DEBUG nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 665.224257] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 665.628692] env[67820]: DEBUG nova.network.neutron [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 665.639556] env[67820]: INFO nova.compute.manager [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: ca9f607c-cde9-459e-aa9e-4b060bc8a68b] Took 0.42 seconds to deallocate network for instance. [ 665.750065] env[67820]: INFO nova.scheduler.client.report [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleted allocations for instance ca9f607c-cde9-459e-aa9e-4b060bc8a68b [ 665.772747] env[67820]: DEBUG oslo_concurrency.lockutils [None req-103b0129-b3f8-4448-a9cb-49a4f827441f tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "ca9f607c-cde9-459e-aa9e-4b060bc8a68b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 102.788s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 665.787157] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 665.858222] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.858542] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.860234] env[67820]: INFO nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 666.304966] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7965b23b-0c0d-46cc-ad5b-175140e0994e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.313058] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5da2fd92-aedd-4305-8e21-178327f8fce8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.343661] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a02f2a61-5312-4ad1-bf49-ac2a4b276dbd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.351427] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-846808b9-142a-43b0-a263-8e04db0cc067 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.364725] env[67820]: DEBUG nova.compute.provider_tree [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.373754] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.387056] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.528s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.387376] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 666.424180] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 666.425405] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 666.425579] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 666.439028] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 666.502298] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 666.505511] env[67820]: DEBUG nova.policy [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2aa367ffec7d4b3caa69171ba56159b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7c89a428d22540a29a3801ff4639145c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 666.534149] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 666.534149] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 666.534149] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 666.534284] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 666.534284] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 666.534284] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 666.534284] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 666.534284] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 666.534418] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 666.534418] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 666.534418] env[67820]: DEBUG nova.virt.hardware [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 666.535335] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b48a6452-830c-463d-915c-015ac9e53b98 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.543705] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-972b6f32-b0b3-40a4-908e-c8847d5de5b1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.893152] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Successfully created port: 5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 667.666805] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Successfully updated port: 5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 667.691446] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 667.691640] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 667.691797] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 667.745088] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.900422] env[67820]: DEBUG nova.compute.manager [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Received event network-vif-plugged-5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 667.900683] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Acquiring lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 667.900922] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 667.901053] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 667.901216] env[67820]: DEBUG nova.compute.manager [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] No waiting events found dispatching network-vif-plugged-5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 667.901378] env[67820]: WARNING nova.compute.manager [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Received unexpected event network-vif-plugged-5413a32b-6a15-49bd-bc4b-1c2dc26cec7e for instance with vm_state building and task_state spawning. 
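[Editor's note] The InvalidArgument traceback above (copy_virtual_disk -> wait_for_task -> _poll_task -> translate_fault) and the CreateVM_Task progress records below both follow the same oslo.vmware task-polling pattern: a looping call re-reads the vCenter task state, reports progress on each pass, returns on 'success', and translates a terminal 'error' state into a VimFaultException. The sketch below is a minimal, self-contained Python illustration of that pattern only; TaskInfo, its field names, and this wait_for_task signature are illustrative stand-ins, not the real oslo.vmware API.

    import time
    from dataclasses import dataclass
    from typing import Callable, Optional


    class VimFaultException(Exception):
        """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list


    @dataclass
    class TaskInfo:
        """Illustrative snapshot of a vCenter task as seen by one poll."""
        state: str                      # 'queued' | 'running' | 'success' | 'error'
        progress: int = 0
        fault: Optional[str] = None     # e.g. 'InvalidArgument'
        message: Optional[str] = None   # e.g. 'A specified parameter was not correct: fileType'


    def wait_for_task(poll: Callable[[], TaskInfo], interval: float = 0.5) -> TaskInfo:
        """Poll until the task reaches a terminal state.

        Mirrors the control flow visible in the traceback above: a looping
        call re-invokes the poll function, progress is reported on each
        pass, and an 'error' state is translated into an exception.
        """
        while True:
            info = poll()
            if info.state == "success":
                return info
            if info.state == "error":
                # the translate_fault() step seen in the traceback
                raise VimFaultException([info.fault], info.message)
            print(f"Task progress is {info.progress}%.")
            time.sleep(interval)


    # Example: a task that fails on its second poll, shaped like the
    # fileType fault recorded earlier in this log.
    polls = iter([
        TaskInfo(state="running", progress=0),
        TaskInfo(state="error", fault="InvalidArgument",
                 message="A specified parameter was not correct: fileType"),
    ])
    try:
        wait_for_task(lambda: next(polls), interval=0)
    except VimFaultException as exc:
        print(f"Faults: {exc.fault_list}")

Under these assumptions, the error branch reproduces the "Faults: ['InvalidArgument']" shape from the traceback, while the progress branch corresponds to the successive poll iterations logged for task-3467317 below (progress is 0%, then 10%, then completed successfully).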
[ 667.901533] env[67820]: DEBUG nova.compute.manager [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Received event network-changed-5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 667.901684] env[67820]: DEBUG nova.compute.manager [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Refreshing instance network info cache due to event network-changed-5413a32b-6a15-49bd-bc4b-1c2dc26cec7e. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 667.901844] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Acquiring lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 668.157019] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Updating instance_info_cache with network_info: [{"id": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "address": "fa:16:3e:69:ca:87", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5413a32b-6a", "ovs_interfaceid": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 668.169364] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 668.169651] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance network_info: |[{"id": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "address": "fa:16:3e:69:ca:87", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", 
"subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5413a32b-6a", "ovs_interfaceid": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 668.170679] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Acquired lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 668.170679] env[67820]: DEBUG nova.network.neutron [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Refreshing network info cache for port 5413a32b-6a15-49bd-bc4b-1c2dc26cec7e {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 668.171406] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:ca:87', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a10c88d7-d13f-44fd-acee-7a734eb5f56a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5413a32b-6a15-49bd-bc4b-1c2dc26cec7e', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 668.179134] env[67820]: DEBUG oslo.service.loopingcall [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 668.181223] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 668.181905] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b3ddf337-ae50-4f61-ad91-3354c1f43c19 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 668.206499] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 668.206499] env[67820]: value = "task-3467317" [ 668.206499] env[67820]: _type = "Task" [ 668.206499] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 668.214809] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467317, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 668.220706] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 668.220907] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 668.525439] env[67820]: DEBUG nova.network.neutron [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Updated VIF entry in instance network info cache for port 5413a32b-6a15-49bd-bc4b-1c2dc26cec7e. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 668.525793] env[67820]: DEBUG nova.network.neutron [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Updating instance_info_cache with network_info: [{"id": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "address": "fa:16:3e:69:ca:87", "network": {"id": "e042d556-88c5-4c0c-9939-9c0922d1679c", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-801438597-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7c89a428d22540a29a3801ff4639145c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a10c88d7-d13f-44fd-acee-7a734eb5f56a", "external-id": "nsx-vlan-transportzone-766", "segmentation_id": 766, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5413a32b-6a", "ovs_interfaceid": "5413a32b-6a15-49bd-bc4b-1c2dc26cec7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 668.536028] env[67820]: DEBUG oslo_concurrency.lockutils [req-8a5ce5dc-ebd4-402b-a9a7-ecd62589552a req-9f631113-b221-4f5f-a1cc-55ba09dd4625 service nova] Releasing lock "refresh_cache-d5c358d0-46f8-4cba-9e37-34c5dfe92526" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 668.716522] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': 
task-3467317, 'name': CreateVM_Task} progress is 10%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 669.217155] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467317, 'name': CreateVM_Task, 'duration_secs': 0.783413} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 669.217412] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 669.217976] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 669.218181] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 669.218781] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 669.218858] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f401e83-9880-45ef-8881-831791d9e4f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.223118] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 669.223118] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5208e716-7d6f-d820-1af9-00a50a500e04" [ 669.223118] env[67820]: _type = "Task" [ 669.223118] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 669.230471] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5208e716-7d6f-d820-1af9-00a50a500e04, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 669.733603] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.733922] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 669.734155] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 696.554383] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 696.554754] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 696.621849] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 696.621849] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.622124] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.622406] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 697.622460] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 697.643576] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.643690] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.643782] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644522] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644522] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644522] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644522] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644522] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644782] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644782] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 697.644782] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 697.645233] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 697.645380] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 698.620745] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 698.621035] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 698.632753] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 698.633019] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 698.633150] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 698.633301] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 698.634471] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb3dc13-39a9-4b89-97ea-3cf011bff0f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.643282] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b2f972-f751-454b-8ed6-df842bf0c2f9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.658225] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe4c19a7-9ebd-432a-a124-8e0d484bcc3a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.664636] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fba6a960-bc08-4c7f-83b8-6a567b9e4666 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 698.693558] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180914MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 698.693750] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 698.693961] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 698.772456] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d08e77ae-af85-4dfe-86e7-60f850369485 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.772602] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.772730] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 14478951-d2c1-4472-af0c-354757e0bb0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.772852] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.772970] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.773102] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.773219] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.773334] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.773447] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.773558] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 698.784937] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.797021] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.806246] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8909b0c9-f236-4d22-b3f1-3bf15b82aa0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.816314] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ed58f91c-e284-4877-8d7c-f1f9e9f1add8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.826076] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ed56b60-f2a0-4ada-ace0-3b85d93693f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.836764] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7fc2bae2-34a4-472a-a097-f93245de32bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.846780] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 18d924ce-620d-45d1-92cf-3f8cfa5a81b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.856957] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 66add717-36cf-4328-8d8c-32beb0eca333 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.865385] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5a22e980-f0c4-4fca-a4bf-e5c1347e9b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.874831] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 91d81e24-4166-4135-9a3b-fe117bdf9c2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.883679] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d41bef54-918d-4a67-9666-4beb8bd1d1dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.894931] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b566340c-f89b-43e3-afae-83b67d4b169c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.904030] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 87d3ff6f-df3e-4fbe-98b3-98878945da63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.916330] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 6c978d86-e29b-4fd6-99e6-1ac37678871d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.925871] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 08b4411d-6ed5-453a-92f2-1dd0b1ee2140 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.939105] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.948785] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5feadca8-33f0-4dac-8e1d-162c77919c77 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.963098] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.973970] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance da533036-7e32-4078-9060-6ee7680cba5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.984241] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4e356219-fa88-474c-97fe-6f6a6ef0c90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 698.994679] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 454a0392-c614-4ca8-903e-48efa44be22f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 699.005298] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 90319a68-5613-4b18-91d3-b606d258ced9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 699.015324] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b80e4804-6f23-4059-8e9c-bf8ecdc2efc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 699.027776] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8e49f35a-037b-4db4-8bec-005b50905852 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 699.037639] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 699.037923] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 699.038106] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 699.472798] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14dfc414-eac7-4e9d-b657-f07539864a4c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.480082] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da6db41b-9f75-4e7d-bf7d-cf87446a7ef9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.509556] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d05bacf4-23e4-4cc9-af7a-8dda0218b8e2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.517012] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f64179f4-77c8-402c-a394-952426d0ba74 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 699.530042] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 699.541030] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 699.553808] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 699.554502] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 700.555216] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.590907] env[67820]: WARNING oslo_vmware.rw_handles [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 710.590907] env[67820]: ERROR oslo_vmware.rw_handles [ 710.591532] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 710.593047] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 710.593300] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Copying Virtual Disk [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/ebed7d0a-b8d0-4d78-b16a-a8ff60e99d82/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 710.593585] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e85233f1-a93b-4be8-838c-b2eea83e7735 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.601946] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 710.601946] env[67820]: value = "task-3467318" [ 710.601946] env[67820]: _type = "Task" [ 710.601946] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 710.609508] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467318, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.112735] env[67820]: DEBUG oslo_vmware.exceptions [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 711.112735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 711.113242] env[67820]: ERROR nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 711.113242] env[67820]: Faults: ['InvalidArgument'] [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Traceback (most recent call last): [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] yield resources [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self.driver.spawn(context, instance, image_meta, [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self._fetch_image_if_missing(context, vi) [ 
711.113242] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] image_cache(vi, tmp_image_ds_loc) [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] vm_util.copy_virtual_disk( [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] session._wait_for_task(vmdk_copy_task) [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return self.wait_for_task(task_ref) [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return evt.wait() [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] result = hub.switch() [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.113528] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return self.greenlet.switch() [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self.f(*self.args, **self.kw) [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] raise exceptions.translate_fault(task_info.error) [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Faults: ['InvalidArgument'] [ 711.113837] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] [ 711.113837] env[67820]: INFO nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 
tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Terminating instance [ 711.115060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 711.115276] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 711.115891] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 711.116092] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 711.116315] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42e787b2-3584-49d9-acf6-0b9e499b20bb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.118810] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3c88a4-9172-4626-9565-2e2d64aed3cd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.125480] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 711.125731] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e9036a4b-520e-41eb-9bef-0a596cc351d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.127901] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 711.128084] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 711.129027] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d21637b-838e-4c12-8f61-0d2d3a2d0d43 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.133737] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Waiting for the task: (returnval){ [ 711.133737] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]526c5775-9c99-b8f6-a810-8e1c331db6da" [ 711.133737] env[67820]: _type = "Task" [ 711.133737] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.140565] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]526c5775-9c99-b8f6-a810-8e1c331db6da, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.206367] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 711.206650] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 711.206993] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleting the datastore file [datastore1] d08e77ae-af85-4dfe-86e7-60f850369485 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 711.207131] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-19bdd76e-fba7-4e12-9cc7-72dbdcc4660a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.213256] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 711.213256] env[67820]: value = "task-3467320" [ 711.213256] env[67820]: _type = "Task" [ 711.213256] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.220888] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467320, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.644065] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 711.644065] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Creating directory with path [datastore1] vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 711.644429] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b43aa248-c382-4998-9a7a-69592c290c79 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.655364] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Created directory with path [datastore1] vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 711.655549] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Fetch image to [datastore1] vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 711.655714] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 711.656458] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8546745d-6ef3-4ea3-9ef3-10299700afa9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.662926] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3555cbf-8dfc-4245-bb7f-56fdcf5904d2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.671414] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35f6c3a-5f76-4cdd-939c-6427fcfe2ce3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.702077] env[67820]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749b5bff-551e-46c2-9015-f7c4c2c81916 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.707458] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7c5ae4c7-3a42-4eb9-864c-947823a0e61c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.721319] env[67820]: DEBUG oslo_vmware.api [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467320, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069804} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 711.721542] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 711.721721] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 711.721886] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 711.722076] env[67820]: INFO nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Took 0.61 seconds to destroy the instance on the hypervisor. 
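The failure sequence recorded above (task-3467318 reporting progress, then faulting, then the instance being destroyed) is driven by the wait-for-task pattern visible in the tracebacks: oslo_vmware.api's wait_for_task blocks while _poll_task logs progress, and when the task ends in error it raises exceptions.translate_fault(task_info.error), which surfaces to the spawn path as the VimFaultException carrying both the message "A specified parameter was not correct: fileType" and Faults: ['InvalidArgument']. A minimal sketch of that poll-and-translate control flow follows; the names below (FakeCopyDiskTask, the VimFaultException stand-in, wait_for_task) are invented for illustration and are not oslo.vmware's real API surface.

    import time

    class VimFaultException(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException: carries the
        # fault list alongside the message, which is why the log shows both
        # the localized text and "Faults: ['InvalidArgument']".
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FakeCopyDiskTask:
        # Simulated vCenter task handle: reports progress twice, then fails
        # the way CopyVirtualDisk_Task (task-3467318) does above.
        def __init__(self):
            self._polls = 0

        def poll(self):
            self._polls += 1
            if self._polls < 3:
                return {"state": "running", "progress": 50 * (self._polls - 1)}
            return {"state": "error",
                    "faults": ["InvalidArgument"],
                    "message": "A specified parameter was not correct: fileType"}

    def wait_for_task(task, interval=0.1):
        # Poll until the task leaves the "running" state; on error, translate
        # the task's recorded fault into a raised exception for the caller to
        # handle -- the same shape as the _poll_task frames in the traceback.
        while True:
            info = task.poll()
            if info["state"] == "running":
                print(f"Task progress is {info['progress']}%.")
                time.sleep(interval)
                continue
            if info["state"] == "error":
                raise VimFaultException(info["faults"], info["message"])
            return info

    if __name__ == "__main__":
        try:
            wait_for_task(FakeCopyDiskTask())
        except VimFaultException as exc:
            print(f"Instance failed to spawn: {exc} Faults: {exc.fault_list}")

With that shape in mind, the entries that follow read naturally: the spawn caller catches the translated fault, aborts its resource claim, and hands the instance back for re-scheduling.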
[ 711.724169] env[67820]: DEBUG nova.compute.claims [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 711.724341] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.724544] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.728376] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 711.780194] env[67820]: DEBUG oslo_vmware.rw_handles [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 711.840031] env[67820]: DEBUG oslo_vmware.rw_handles [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 711.840031] env[67820]: DEBUG oslo_vmware.rw_handles [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 712.206042] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29719fb2-b187-42ec-b9d5-5c70a4527fcc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.213348] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d3395a7-9ead-48cb-b452-5a99ac08a589 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.242618] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62a40657-065d-4b2c-9c79-3231f665477c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.250063] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0422848a-a123-4545-8c27-3b1782f9e30b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.262602] env[67820]: DEBUG nova.compute.provider_tree [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.272335] env[67820]: DEBUG nova.scheduler.client.report [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.286151] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.561s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.286689] env[67820]: ERROR nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.286689] env[67820]: Faults: ['InvalidArgument'] [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Traceback (most recent call last): [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 712.286689] env[67820]: ERROR 
nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self.driver.spawn(context, instance, image_meta, [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self._vmops.spawn(context, instance, image_meta, injected_files, [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self._fetch_image_if_missing(context, vi) [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] image_cache(vi, tmp_image_ds_loc) [ 712.286689] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] vm_util.copy_virtual_disk( [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] session._wait_for_task(vmdk_copy_task) [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return self.wait_for_task(task_ref) [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return evt.wait() [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] result = hub.switch() [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] return self.greenlet.switch() [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 712.287082] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] self.f(*self.args, **self.kw) [ 712.287398] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 712.287398] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] raise exceptions.translate_fault(task_info.error) [ 712.287398] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 712.287398] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Faults: ['InvalidArgument'] [ 712.287398] env[67820]: ERROR nova.compute.manager [instance: d08e77ae-af85-4dfe-86e7-60f850369485] [ 712.287398] env[67820]: DEBUG nova.compute.utils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 712.288706] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Build of instance d08e77ae-af85-4dfe-86e7-60f850369485 was re-scheduled: A specified parameter was not correct: fileType [ 712.288706] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 712.289080] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 712.289255] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 712.289422] env[67820]: DEBUG nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 712.289649] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 712.676636] env[67820]: DEBUG nova.network.neutron [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.690531] env[67820]: INFO nova.compute.manager [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d08e77ae-af85-4dfe-86e7-60f850369485] Took 0.40 seconds to deallocate network for instance. [ 712.787334] env[67820]: INFO nova.scheduler.client.report [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted allocations for instance d08e77ae-af85-4dfe-86e7-60f850369485 [ 712.815710] env[67820]: DEBUG oslo_concurrency.lockutils [None req-feacce4f-a5e2-4a71-9b6d-5b4b2258dfac tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d08e77ae-af85-4dfe-86e7-60f850369485" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 150.501s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 712.835171] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 712.886063] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 712.886338] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.888349] env[67820]: INFO nova.compute.claims [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.328932] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4caafd88-22a4-4588-bb42-b515013a61eb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.336617] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f66ba7f3-4fc1-438c-ab3b-873bed929be0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.366684] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-518a598e-f305-444c-b4e3-0513d149f188 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.374238] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5740f7c-913f-4732-94bf-c6361d564956 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.390102] env[67820]: DEBUG nova.compute.provider_tree [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.400463] env[67820]: DEBUG nova.scheduler.client.report [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.420326] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.534s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.420836] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 713.454313] env[67820]: DEBUG nova.compute.utils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.455787] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 713.455960] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.465592] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 713.541098] env[67820]: DEBUG nova.policy [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a98df688e574682bf50408dccf9c8da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b41216bc91264efa9dc3bff8c65ad1e6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.542713] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 713.568718] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 713.568973] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 713.569145] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 713.569328] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 713.569476] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 713.569626] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 713.569829] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 713.569983] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 713.570169] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 713.570370] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 713.570498] env[67820]: DEBUG nova.virt.hardware [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 713.571382] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db43f518-3085-4bf3-ae89-4e63477822af {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 713.579759] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89facac7-7648-4c62-b851-86455691decf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 714.200532] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Successfully created port: 0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 714.242017] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.242017] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.975639] env[67820]: DEBUG nova.compute.manager [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Received event network-vif-plugged-0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 714.976303] env[67820]: DEBUG oslo_concurrency.lockutils [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] Acquiring lock 
"ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 714.977169] env[67820]: DEBUG oslo_concurrency.lockutils [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 714.977169] env[67820]: DEBUG oslo_concurrency.lockutils [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 714.977278] env[67820]: DEBUG nova.compute.manager [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] No waiting events found dispatching network-vif-plugged-0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 714.977470] env[67820]: WARNING nova.compute.manager [req-7986353c-3a59-4a8e-a4c8-cdc9bf4c6c5a req-a4e04fe9-bb03-493f-bf11-748f58af1013 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Received unexpected event network-vif-plugged-0fd9572a-37f4-4e7f-9279-c069874c5ef8 for instance with vm_state building and task_state spawning. [ 715.034361] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Successfully updated port: 0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 715.054206] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 715.054359] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquired lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 715.054511] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 715.117389] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: 
ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.364148] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Updating instance_info_cache with network_info: [{"id": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "address": "fa:16:3e:96:93:01", "network": {"id": "ea4d865d-0bb3-48a1-9c9c-678512db06f1", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2073643026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b41216bc91264efa9dc3bff8c65ad1e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fd9572a-37", "ovs_interfaceid": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.381333] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Releasing lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 715.381333] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance network_info: |[{"id": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "address": "fa:16:3e:96:93:01", "network": {"id": "ea4d865d-0bb3-48a1-9c9c-678512db06f1", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2073643026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b41216bc91264efa9dc3bff8c65ad1e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fd9572a-37", "ovs_interfaceid": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 715.381453] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:93:01', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b4734e5e-2a76-4bda-8905-70c9bf9e007f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0fd9572a-37f4-4e7f-9279-c069874c5ef8', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 715.388429] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Creating folder: Project (b41216bc91264efa9dc3bff8c65ad1e6). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.388991] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-72717dbe-2542-4624-8c89-f47f8a1a96b6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.400390] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Created folder: Project (b41216bc91264efa9dc3bff8c65ad1e6) in parent group-v692668. [ 715.400578] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Creating folder: Instances. Parent ref: group-v692705. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 715.400801] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6638000-1862-484e-a951-951e997cdc9b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 715.410207] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Created folder: Instances in parent group-v692705. [ 715.410437] env[67820]: DEBUG oslo.service.loopingcall [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
[ 715.410614] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 715.410813] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b4210e3-0f59-46bf-81f0-f0176ffe8bdf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 715.429528] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 715.429528] env[67820]: value = "task-3467323"
[ 715.429528] env[67820]: _type = "Task"
[ 715.429528] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 715.437369] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467323, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 715.939824] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467323, 'name': CreateVM_Task, 'duration_secs': 0.290042} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 715.940528] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 715.940669] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 715.940833] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 715.941164] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 715.941398] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb2523c3-f8d9-47de-8ef0-260b9f1a48e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 715.945775] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for the task: (returnval){
[ 715.945775] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52a25f30-515f-308d-637b-c019819dba03"
[ 715.945775] env[67820]: _type = "Task"
[ 715.945775] env[67820]: } to complete.
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 715.953176] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52a25f30-515f-308d-637b-c019819dba03, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 716.456237] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 716.456535] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 716.456793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 717.029943] env[67820]: DEBUG nova.compute.manager [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Received event network-changed-0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 717.030012] env[67820]: DEBUG nova.compute.manager [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Refreshing instance network info cache due to event network-changed-0fd9572a-37f4-4e7f-9279-c069874c5ef8. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 717.030188] env[67820]: DEBUG oslo_concurrency.lockutils [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] Acquiring lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 717.030332] env[67820]: DEBUG oslo_concurrency.lockutils [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] Acquired lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 717.030493] env[67820]: DEBUG nova.network.neutron [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Refreshing network info cache for port 0fd9572a-37f4-4e7f-9279-c069874c5ef8 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 717.354930] env[67820]: DEBUG nova.network.neutron [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Updated VIF entry in instance network info cache for port 0fd9572a-37f4-4e7f-9279-c069874c5ef8. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 717.355301] env[67820]: DEBUG nova.network.neutron [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Updating instance_info_cache with network_info: [{"id": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "address": "fa:16:3e:96:93:01", "network": {"id": "ea4d865d-0bb3-48a1-9c9c-678512db06f1", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2073643026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b41216bc91264efa9dc3bff8c65ad1e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fd9572a-37", "ovs_interfaceid": "0fd9572a-37f4-4e7f-9279-c069874c5ef8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 717.364621] env[67820]: DEBUG oslo_concurrency.lockutils [req-3d6d2e14-bd6e-4984-8b24-7ed5f6c2812c req-a0a37d40-15ed-47e7-90a9-ce08ce40bdb4 service nova] Releasing lock "refresh_cache-ffe8063c-5dae-4e58-beca-f3a883d5d8df" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 755.621516] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
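
Records like the "Running periodic task ..." lines on either side of this point come from the periodic-task machinery: each ComputeManager method registers itself with an interval, and a runner fires whichever tasks are due on every tick. A simplified sketch of that registration-and-dispatch pattern, with invented names rather than the real oslo.service decorators:

    import time

    class PeriodicTasks:
        _registry = []

        @classmethod
        def periodic_task(cls, interval):
            def wrap(fn):
                cls._registry.append((fn, interval))
                return fn
            return wrap

    class Manager(PeriodicTasks):
        @PeriodicTasks.periodic_task(interval=60)
        def _poll_volume_usage(self):
            print('Running periodic task _poll_volume_usage')

    def run_periodic_tasks(manager, last_run):
        # Fire every registered task whose interval has elapsed (or which has
        # never run), producing one "Running periodic task ..." record each.
        now = time.monotonic()
        for fn, interval in manager._registry:
            last = last_run.get(fn.__name__)
            if last is None or now - last >= interval:
                fn(manager)
                last_run[fn.__name__] = now

    run_periodic_tasks(Manager(), last_run={})  # first tick fires everything
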
[ 756.616541] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 757.622304] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 758.540894] env[67820]: WARNING oslo_vmware.rw_handles [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles response.begin()
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 758.540894] env[67820]: ERROR oslo_vmware.rw_handles
[ 758.541319] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 758.542963] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 758.543271] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Copying Virtual Disk [datastore1] vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1]
vmware_temp/1f4e1a9c-0304-4b44-9402-b7c7adbe182c/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 758.543537] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4ad645e4-81ff-42f0-be4f-4c7a3124d825 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 758.553734] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Waiting for the task: (returnval){
[ 758.553734] env[67820]: value = "task-3467324"
[ 758.553734] env[67820]: _type = "Task"
[ 758.553734] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 758.561749] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Task: {'id': task-3467324, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 758.621434] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 758.621673] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 759.065403] env[67820]: DEBUG oslo_vmware.exceptions [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
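
The "Fault InvalidArgument not matched" record above is the fault-translation step: oslo.vmware maps the fault name returned by vCenter to a dedicated exception class when one is registered, and otherwise falls back to a generic VimFaultException carrying the fault list, which is exactly what surfaces in the spawn failure below. A toy sketch of that lookup; the registry contents here are hypothetical, not the library's real table:

    class VimFaultException(Exception):
        # Generic VMware fault; carries the raw fault names from the response.
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileNotFoundException(VimFaultException):
        pass

    # Only some fault names get dedicated classes; everything else falls
    # through to the generic exception ("Fault InvalidArgument not matched").
    FAULT_CLASSES = {'FileNotFound': FileNotFoundException}

    def translate_fault(fault_name, message):
        cls = FAULT_CLASSES.get(fault_name, VimFaultException)
        return cls([fault_name], message)

    exc = translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')
    print(type(exc).__name__, exc.fault_list)  # VimFaultException ['InvalidArgument']
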
[ 759.066124] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 759.067165] env[67820]: ERROR nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 759.067165] env[67820]: Faults: ['InvalidArgument']
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Traceback (most recent call last):
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] yield resources
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] self.driver.spawn(context, instance, image_meta,
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] self._fetch_image_if_missing(context, vi)
[ 759.067165] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] image_cache(vi, tmp_image_ds_loc)
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] vm_util.copy_virtual_disk(
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] session._wait_for_task(vmdk_copy_task)
[ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] return self.wait_for_task(task_ref) [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] return evt.wait() [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] result = hub.switch() [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.067599] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] return self.greenlet.switch() [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] self.f(*self.args, **self.kw) [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] raise exceptions.translate_fault(task_info.error) [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Faults: ['InvalidArgument'] [ 759.068066] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] [ 759.068066] env[67820]: INFO nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Terminating instance [ 759.069768] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 759.071052] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 759.071052] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 
tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 759.071052] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 759.071221] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-142c7690-a34a-44ba-b197-1560b9158792 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.073592] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbe15873-bddc-4d44-823d-71483631e424 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.080460] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 759.080720] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5029b80c-3013-4b6b-b3f4-29f5861246e7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.083010] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 759.083234] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 759.084230] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb2a6c4d-d489-4789-b2e7-96aef7b5f917 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.088999] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for the task: (returnval){ [ 759.088999] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d229d8-c56e-c9e1-4dff-8bf871a7d690" [ 759.088999] env[67820]: _type = "Task" [ 759.088999] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 759.096601] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d229d8-c56e-c9e1-4dff-8bf871a7d690, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 759.165064] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 759.165311] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 759.165491] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Deleting the datastore file [datastore1] 14478951-d2c1-4472-af0c-354757e0bb0b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 759.165770] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3dd8e174-c7f4-48be-a8e5-1efa64389695 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.171823] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Waiting for the task: (returnval){ [ 759.171823] env[67820]: value = "task-3467326" [ 759.171823] env[67820]: _type = "Task" [ 759.171823] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 759.179413] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Task: {'id': task-3467326, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 759.599600] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 759.599893] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Creating directory with path [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 759.600145] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5aa0bc63-f4ee-496c-8448-b84808272e6b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.612034] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Created directory with path [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 759.612034] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Fetch image to [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 759.612034] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 759.612579] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cecdf032-a472-4d8d-b73a-79b87ad1ba27 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.616487] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.619313] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef0d3dc1-acc4-4916-8303-f41e1048b14e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.621772] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 759.622111] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 759.622261] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 759.630116] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d537abed-462a-4663-9107-f476f9549020 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.661474] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00efdae6-a012-417d-8ac6-a483413c8ac2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 759.665832] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.665935] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.666097] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.666165] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.666267] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.666387] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 759.666504] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 759.666817] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 759.666817] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 759.666993] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 759.666993] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 759.667441] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 759.667993] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 759.668156] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 759.671178] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5407bca1-e2d3-49ff-9296-519fca125ef3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 759.680735] env[67820]: DEBUG oslo_vmware.api [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Task: {'id': task-3467326, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067715} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
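
The _heal_instance_info_cache records a few lines up show the selection step of the cache-healing task: every instance still in the Building state is skipped, and if nothing else remains the run ends with "Didn't find any instances for network info cache update." A small sketch of that filter, with illustrative field names rather than Nova's actual instance objects:

    def instances_to_heal(instances):
        # Skip instances that are still building; only the rest would get a
        # network info cache refresh on this pass.
        to_heal = []
        for inst in instances:
            if inst['vm_state'] == 'building':
                print('[instance: %(uuid)s] Skipping network cache update '
                      'for instance because it is Building.' % inst)
                continue
            to_heal.append(inst)
        if not to_heal:
            print("Didn't find any instances for network info cache update.")
        return to_heal

    instances_to_heal([{'uuid': 'ffe8063c-5dae-4e58-beca-f3a883d5d8df',
                        'vm_state': 'building'}])
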
[ 759.680735] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 759.680855] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 759.681015] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 759.681201] env[67820]: INFO nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 759.683840] env[67820]: DEBUG nova.compute.claims [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 759.684013] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 759.684241] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 759.697080] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 759.751056] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Creating HTTP connection to write to file with size = 21318656 and URL =
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 759.809323] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 759.809520] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 760.156833] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-229a4414-9789-42f5-ba92-ee7013f1e0e4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.164526] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f84115-fe12-460d-be30-a3a7a8123e5b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.194410] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04ed85dc-2fda-4bcd-a9d1-3806bf258e41 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.201693] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70ce0e91-4d08-4f3a-82d9-f08ab8344383 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.215088] env[67820]: DEBUG nova.compute.provider_tree [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 760.224165] env[67820]: DEBUG nova.scheduler.client.report [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 760.241025] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.554s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 760.241025] env[67820]: ERROR nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 760.241025] env[67820]: Faults: ['InvalidArgument']
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Traceback (most recent call last):
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     self.driver.spawn(context, instance, image_meta,
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 760.241025] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     self._fetch_image_if_missing(context, vi)
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     image_cache(vi, tmp_image_ds_loc)
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     vm_util.copy_virtual_disk(
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     session._wait_for_task(vmdk_copy_task)
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     return self.wait_for_task(task_ref)
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     return evt.wait()
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     result = hub.switch()
[ 760.241423] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     return self.greenlet.switch()
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     self.f(*self.args, **self.kw)
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]     raise exceptions.translate_fault(task_info.error)
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Faults: ['InvalidArgument']
[ 760.241710] env[67820]: ERROR nova.compute.manager [instance: 14478951-d2c1-4472-af0c-354757e0bb0b]
[ 760.241710] env[67820]: DEBUG nova.compute.utils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 760.241710] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Build of instance 14478951-d2c1-4472-af0c-354757e0bb0b was re-scheduled: A specified parameter was not correct: fileType
[ 760.241952] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 760.241952] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 760.241952] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 760.241952] env[67820]: DEBUG nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 760.242079] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.582346] env[67820]: DEBUG nova.network.neutron [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.595447] env[67820]: INFO nova.compute.manager [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] [instance: 14478951-d2c1-4472-af0c-354757e0bb0b] Took 0.35 seconds to deallocate network for instance. [ 760.621496] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 760.631560] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.631788] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.631949] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 760.632119] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 760.633271] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9edb8334-67e1-4d0f-b5a0-fe2d45dcc92d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.648740] env[67820]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08cdf310-3f62-4298-9299-29ed4f28463d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.661648] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-effaa57d-c0f3-4c31-80ce-eb72afc1f481 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.669883] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e37d454-d8bc-4cc5-b822-e4153b74ecb1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 760.702310] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 760.702310] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.702310] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.730727] env[67820]: INFO nova.scheduler.client.report [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Deleted allocations for instance 14478951-d2c1-4472-af0c-354757e0bb0b [ 760.757836] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db34ba0f-61f7-486a-a43b-d8bdec04b772 tempest-FloatingIPsAssociationTestJSON-1489258656 tempest-FloatingIPsAssociationTestJSON-1489258656-project-member] Lock "14478951-d2c1-4472-af0c-354757e0bb0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 194.813s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 760.768401] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 760.837282] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.837561] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.838210] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.838867] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.839144] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.839382] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.840175] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.840340] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.840497] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 760.842771] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.852496] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.866371] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8909b0c9-f236-4d22-b3f1-3bf15b82aa0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.879282] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ed58f91c-e284-4877-8d7c-f1f9e9f1add8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.894426] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ed56b60-f2a0-4ada-ace0-3b85d93693f1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.906413] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7fc2bae2-34a4-472a-a097-f93245de32bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.917748] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 18d924ce-620d-45d1-92cf-3f8cfa5a81b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.929876] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 66add717-36cf-4328-8d8c-32beb0eca333 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.941309] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5a22e980-f0c4-4fca-a4bf-e5c1347e9b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.951729] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 91d81e24-4166-4135-9a3b-fe117bdf9c2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.967797] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d41bef54-918d-4a67-9666-4beb8bd1d1dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.979439] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b566340c-f89b-43e3-afae-83b67d4b169c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.988904] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 87d3ff6f-df3e-4fbe-98b3-98878945da63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 760.999556] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 6c978d86-e29b-4fd6-99e6-1ac37678871d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.009869] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 08b4411d-6ed5-453a-92f2-1dd0b1ee2140 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.021245] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.034359] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5feadca8-33f0-4dac-8e1d-162c77919c77 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.046177] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.058856] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance da533036-7e32-4078-9060-6ee7680cba5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.073295] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4e356219-fa88-474c-97fe-6f6a6ef0c90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.085086] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 454a0392-c614-4ca8-903e-48efa44be22f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.096482] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 90319a68-5613-4b18-91d3-b606d258ced9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.107932] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b80e4804-6f23-4059-8e9c-bf8ecdc2efc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.118965] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8e49f35a-037b-4db4-8bec-005b50905852 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.129473] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.142210] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 761.142331] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 761.142448] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 761.529262] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c049a86a-5fea-4202-be5f-7dabd8443e6a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.537402] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7e4f86f-edcb-4e43-a427-2bb2512fcbe7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.569845] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8407066-a128-44eb-821c-8a18680a63cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.575596] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f71eee7-5c37-4b33-bfdf-83492fe6fd6a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.590069] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 761.599573] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 761.614696] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 761.614902] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.913s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 761.615202] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.772s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 761.616737] env[67820]: INFO nova.compute.claims [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 762.116668] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2271ade-bd6c-4eed-8283-f8b39312e7cb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.125225] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d007e03-121d-47b7-abd6-52d86d72f991 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.156686] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f730593b-8dd9-4b22-a051-327a56785a3d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.166026] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c8e0146-07ee-42bc-ae82-7b152dbe2d71 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.180361] env[67820]: DEBUG nova.compute.provider_tree [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 762.189480] env[67820]: DEBUG nova.scheduler.client.report [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 762.206085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 762.206679] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 
tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 762.246217] env[67820]: DEBUG nova.compute.utils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 762.247506] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 762.247702] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 762.259473] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 762.328699] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 762.359961] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 762.360230] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 762.360383] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 762.360561] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 762.360706] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 762.360849] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 762.361128] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 762.361352] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 762.361559] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 762.361755] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 762.361959] env[67820]: DEBUG nova.virt.hardware [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 762.362853] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9d4722d-f4c7-4ef6-a07c-d6775de6fcf6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 762.376488] env[67820]: DEBUG nova.policy [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00d455fe792d40679259447f557f84bd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41734a2a52984cada05da499d40e56d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 762.379044] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbffabfa-4b73-4139-852c-1e4ac1f216b5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.195950] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Successfully created port: 73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 764.161195] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Successfully updated port: 73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 764.174568] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 764.174655] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquired lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 764.174754] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 764.238079] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 764.420152] env[67820]: DEBUG nova.compute.manager [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Received event network-vif-plugged-73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 764.420406] env[67820]: DEBUG oslo_concurrency.lockutils [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] Acquiring lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 764.420616] env[67820]: DEBUG oslo_concurrency.lockutils [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 764.420783] env[67820]: DEBUG oslo_concurrency.lockutils [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 764.420946] env[67820]: DEBUG nova.compute.manager [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] No waiting events found dispatching network-vif-plugged-73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 764.421205] env[67820]: WARNING nova.compute.manager [req-b9e2c512-e6a5-4736-bc9e-d28424a3cb91 req-413b052f-8904-4964-9681-e090ca9598db service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Received unexpected event network-vif-plugged-73879774-1ae5-4f9f-b24a-6d670109eb14 for instance with vm_state building and task_state spawning. 
[ 764.651491] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Updating instance_info_cache with network_info: [{"id": "73879774-1ae5-4f9f-b24a-6d670109eb14", "address": "fa:16:3e:21:01:b4", "network": {"id": "89e67c6c-3930-4e80-98dd-23f4da459da1", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-294951884-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41734a2a52984cada05da499d40e56d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73879774-1a", "ovs_interfaceid": "73879774-1ae5-4f9f-b24a-6d670109eb14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 764.663455] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Releasing lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 764.663866] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance network_info: |[{"id": "73879774-1ae5-4f9f-b24a-6d670109eb14", "address": "fa:16:3e:21:01:b4", "network": {"id": "89e67c6c-3930-4e80-98dd-23f4da459da1", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-294951884-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41734a2a52984cada05da499d40e56d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73879774-1a", "ovs_interfaceid": "73879774-1ae5-4f9f-b24a-6d670109eb14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 764.664706] 
env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:21:01:b4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a9abd00f-2cea-40f8-9804-a56b6431192d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73879774-1ae5-4f9f-b24a-6d670109eb14', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 764.674465] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Creating folder: Project (41734a2a52984cada05da499d40e56d8). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 764.675144] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-abfb90a1-3e86-4c3c-a011-87d5d03e9066 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.686293] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Created folder: Project (41734a2a52984cada05da499d40e56d8) in parent group-v692668.
[ 764.686519] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Creating folder: Instances. Parent ref: group-v692708. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 764.686772] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-98dd4395-c60c-4486-b5bb-a45518dafda4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.695395] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Created folder: Instances in parent group-v692708.
[ 764.695625] env[67820]: DEBUG oslo.service.loopingcall [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 764.695802] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 764.695994] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4307a4ba-b5fd-4382-ba0f-d26a429e0b49 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 764.717468] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 764.717468] env[67820]: value = "task-3467329"
[ 764.717468] env[67820]: _type = "Task"
[ 764.717468] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 764.727578] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467329, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 765.227930] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467329, 'name': CreateVM_Task, 'duration_secs': 0.29931} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 765.228133] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 765.228845] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 765.228973] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 765.229521] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 765.229572] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a9e013b9-6cfd-4d7b-af19-093dedd64d2b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 765.237361] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for the task: (returnval){
[ 765.237361] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]524c35d2-abd6-37cf-1df8-e1d6c86d1474"
[ 765.237361] env[67820]: _type = "Task"
[ 765.237361] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 765.250607] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]524c35d2-abd6-37cf-1df8-e1d6c86d1474, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 765.747680] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 765.747929] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 765.748150] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 766.169298] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 766.733040] env[67820]: DEBUG nova.compute.manager [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Received event network-changed-73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 766.733310] env[67820]: DEBUG nova.compute.manager [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Refreshing instance network info cache due to event network-changed-73879774-1ae5-4f9f-b24a-6d670109eb14.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 766.733827] env[67820]: DEBUG oslo_concurrency.lockutils [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] Acquiring lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 766.733827] env[67820]: DEBUG oslo_concurrency.lockutils [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] Acquired lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 766.733827] env[67820]: DEBUG nova.network.neutron [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Refreshing network info cache for port 73879774-1ae5-4f9f-b24a-6d670109eb14 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 767.289710] env[67820]: DEBUG nova.network.neutron [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Updated VIF entry in instance network info cache for port 73879774-1ae5-4f9f-b24a-6d670109eb14. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 767.289710] env[67820]: DEBUG nova.network.neutron [req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Updating instance_info_cache with network_info: [{"id": "73879774-1ae5-4f9f-b24a-6d670109eb14", "address": "fa:16:3e:21:01:b4", "network": {"id": "89e67c6c-3930-4e80-98dd-23f4da459da1", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-294951884-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41734a2a52984cada05da499d40e56d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73879774-1a", "ovs_interfaceid": "73879774-1ae5-4f9f-b24a-6d670109eb14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 767.291729] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "be35d888-f649-44e4-af23-341b8bfc81f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 767.301458] env[67820]: DEBUG oslo_concurrency.lockutils
[req-92aa3004-1182-4dd2-8832-6d29c34e74fd req-ce28f017-2293-4fa2-a3d3-0542d9a2f60f service nova] Releasing lock "refresh_cache-39420619-61c5-4e52-8236-d3abc3ef6f0f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 768.328515] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 773.803525] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "1cc3b207-a628-4fe5-8908-6879483806b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.080136] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "80f480dc-9bb8-4764-9b6b-793c0954962e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.220151] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.220438] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 781.410748] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.719318] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 781.917317] env[67820]: DEBUG oslo_concurrency.lockutils
[None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 787.375285] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.735335] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.537209] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5a234b1e-0871-441b-9257-171ad7c3f418 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Acquiring lock "49515d53-7359-4bc6-8a26-95ffd4fe4ed4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.537497] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5a234b1e-0871-441b-9257-171ad7c3f418 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "49515d53-7359-4bc6-8a26-95ffd4fe4ed4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 805.624821] env[67820]: WARNING oslo_vmware.rw_handles [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [
805.624821] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 805.624821] env[67820]: ERROR oslo_vmware.rw_handles [ 805.625389] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 805.628972] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 805.629297] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Copying Virtual Disk [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/86e177d5-1e78-47c0-b90c-a019820a29ab/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 805.629646] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d3056eac-c8a5-4c24-ae1f-7ee6cb020039 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 805.640601] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for the task: (returnval){ [ 805.640601] env[67820]: value = "task-3467330" [ 805.640601] env[67820]: _type = "Task" [ 805.640601] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 805.651942] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Task: {'id': task-3467330, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 806.050042] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.050042] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 806.152697] env[67820]: DEBUG oslo_vmware.exceptions [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 806.153078] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 806.153695] env[67820]: ERROR nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 806.153695] env[67820]: Faults: ['InvalidArgument'] [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Traceback (most recent call last): [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] yield resources [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self.driver.spawn(context, instance, image_meta, [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self._fetch_image_if_missing(context, vi) [ 806.153695] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] image_cache(vi, tmp_image_ds_loc) [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] vm_util.copy_virtual_disk( [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] session._wait_for_task(vmdk_copy_task) [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return self.wait_for_task(task_ref) [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return evt.wait() [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] result = hub.switch() [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 806.154063] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return self.greenlet.switch() [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self.f(*self.args, **self.kw) [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] raise exceptions.translate_fault(task_info.error) [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 806.154386] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Faults: ['InvalidArgument'] [ 806.154386] env[67820]: ERROR 
nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] [ 806.155009] env[67820]: INFO nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Terminating instance [ 806.158188] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 806.158508] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 806.158914] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 806.159101] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 806.160086] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-514f843f-1577-4359-9727-9b49959805d5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.164622] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8de3dc28-5874-49ae-9bb7-e9a20c6e5b91 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.174172] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 806.174172] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1fe6797e-f03b-4dd7-af46-1047851518f4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.177071] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 806.177466] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 
tempest-ServerExternalEventsTest-1259596175-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 806.178782] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-37f8d16f-b6e2-410b-ac65-ff61a8c0bd58 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.186081] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for the task: (returnval){ [ 806.186081] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5282319d-c52e-1b72-185a-bd86a7ec042d" [ 806.186081] env[67820]: _type = "Task" [ 806.186081] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 806.196607] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5282319d-c52e-1b72-185a-bd86a7ec042d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 806.275757] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 806.276108] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 806.280013] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Deleting the datastore file [datastore1] 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 806.280013] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aa1a47f4-40fb-405b-8184-26717e23991a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.283463] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for the task: (returnval){ [ 806.283463] env[67820]: value = "task-3467332" [ 806.283463] env[67820]: _type = "Task" [ 806.283463] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 806.296946] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Task: {'id': task-3467332, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 806.704380] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 806.707960] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Creating directory with path [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 806.707960] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4fe23b01-48ca-48d7-b5f4-618e1f9f8724 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.734160] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Created directory with path [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 806.734380] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Fetch image to [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 806.738106] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 806.738413] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c587ebe9-b6b5-42cd-9a8d-af8ec8970a56 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.750296] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8105e392-cb35-40ee-bb80-59c5254d968b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.763966] env[67820]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e54b3d34-869f-4ec4-861d-0695d552e6a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.807114] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50eaab8-148b-4bcb-beed-0c29ceedee94 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.818326] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4556f832-05ee-4225-8716-ba2b40465580 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 806.818551] env[67820]: DEBUG oslo_vmware.api [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Task: {'id': task-3467332, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.28542} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 806.818721] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 806.818933] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 806.819192] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 806.819341] env[67820]: INFO nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Took 0.66 seconds to destroy the instance on the hypervisor. 
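Every vCenter task in this log (CreateVM_Task, CopyVirtualDisk_Task, SearchDatastore_Task, DeleteDatastoreFile_Task) follows the same lifecycle visible in the records above: the driver invokes an API method, receives a task reference, and polls it on a fixed interval, logging "progress is 0%." until the state flips to success or error. Below is a minimal sketch of that poll loop in plain Python, not oslo.vmware's actual implementation; the TaskInfo stand-in, the interval, and the printed strings are illustrative only.

```python
import time

POLL_INTERVAL = 0.5  # seconds between polls; the real loop interval is configurable


class TaskInfo:
    """Illustrative stand-in for the TaskInfo object a vSphere task exposes."""

    def __init__(self, state, progress=0, error=None):
        self.state = state        # 'running', 'success', or 'error'
        self.progress = progress  # integer percentage
        self.error = error        # fault carried by a failed task


def wait_for_task(poll):
    """Poll `poll()` until the task finishes, mirroring the log pattern:
    repeated 'progress is N%.' records, then success or a raised fault."""
    while True:
        info = poll()
        if info.state == "success":
            print("completed successfully")
            return info
        if info.state == "error":
            # oslo.vmware translates the task fault into a VimFaultException
            # at this point, which is what later surfaces in this log as
            # "A specified parameter was not correct: fileType".
            raise RuntimeError(info.error)
        print(f"progress is {info.progress}%.")
        time.sleep(POLL_INTERVAL)


# Example: a task that reports running once, then succeeds.
states = iter([TaskInfo("running"), TaskInfo("success", 100)])
wait_for_task(lambda: next(states))
```

A task that ends in the error state is what aborts the spawn path: the poller raises instead of returning, and the compute manager's error handling takes over, as the tracebacks in this log show.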
[ 806.821442] env[67820]: DEBUG nova.compute.claims [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 806.821652] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.821875] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 806.842368] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 806.916312] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 806.983653] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 806.983887] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 807.370984] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6160267-f100-4c15-ab84-6c2eec9f2bbc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.381216] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7082a36-9149-4298-ad3d-9089744d6bfd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.416556] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1cecbcc-bba2-4ef3-956d-6f24bd026456 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.424984] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18028787-0dc8-43d5-9361-629bab075ef4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 807.439725] env[67820]: DEBUG nova.compute.provider_tree [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 807.457290] env[67820]: DEBUG nova.scheduler.client.report [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 807.476554] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.652s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 807.476554] env[67820]: ERROR nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 807.476554] env[67820]: Faults: ['InvalidArgument'] [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Traceback (most recent call last): [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self.driver.spawn(context, instance, image_meta, [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 807.476554] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self._fetch_image_if_missing(context, vi) [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] image_cache(vi, tmp_image_ds_loc) [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] vm_util.copy_virtual_disk( [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] session._wait_for_task(vmdk_copy_task) [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return self.wait_for_task(task_ref) [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return evt.wait() [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] result = hub.switch() [ 807.476941] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] return self.greenlet.switch() [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] self.f(*self.args, **self.kw) [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] raise exceptions.translate_fault(task_info.error) [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Faults: ['InvalidArgument'] [ 807.477319] env[67820]: ERROR nova.compute.manager [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] [ 807.477319] env[67820]: DEBUG nova.compute.utils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 807.479839] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Build of instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 was re-scheduled: A specified parameter was not correct: fileType [ 807.479839] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 807.480295] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 807.480477] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 807.480631] env[67820]: DEBUG nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 807.480788] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 808.173054] env[67820]: DEBUG nova.network.neutron [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.188622] env[67820]: INFO nova.compute.manager [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Took 0.71 seconds to deallocate network for instance. [ 808.341698] env[67820]: INFO nova.scheduler.client.report [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Deleted allocations for instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 [ 808.374720] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f9d8989b-ad7b-4133-95e4-868155045899 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.846s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.378699] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 42.209s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 808.378957] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 808.379210] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 808.379409] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.384131] env[67820]: INFO nova.compute.manager [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Terminating instance [ 808.386351] env[67820]: DEBUG nova.compute.manager [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 808.386533] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 808.387262] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-98a223b8-2b98-4a4d-b68f-700ee4978486 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.399506] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-888e52ed-efdd-4722-98a8-a1c9e91e79ed {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 808.415833] env[67820]: DEBUG nova.compute.manager [None req-efc88ee7-caa1-4bc4-bac7-632d96f2c478 tempest-ServersAdminNegativeTestJSON-752970805 tempest-ServersAdminNegativeTestJSON-752970805-project-member] [instance: 8909b0c9-f236-4d22-b3f1-3bf15b82aa0e] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.440831] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0421f92d-8c91-4a60-beb9-f1a799e6d1b4 could not be found.
[ 808.440831] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 808.440831] env[67820]: INFO nova.compute.manager [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Took 0.05 seconds to destroy the instance on the hypervisor. [ 808.440831] env[67820]: DEBUG oslo.service.loopingcall [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 808.440831] env[67820]: DEBUG nova.compute.manager [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 808.441569] env[67820]: DEBUG nova.network.neutron [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 808.462820] env[67820]: DEBUG nova.compute.manager [None req-efc88ee7-caa1-4bc4-bac7-632d96f2c478 tempest-ServersAdminNegativeTestJSON-752970805 tempest-ServersAdminNegativeTestJSON-752970805-project-member] [instance: 8909b0c9-f236-4d22-b3f1-3bf15b82aa0e] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 808.474698] env[67820]: DEBUG nova.network.neutron [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.498589] env[67820]: INFO nova.compute.manager [-] [instance: 0421f92d-8c91-4a60-beb9-f1a799e6d1b4] Took 0.06 seconds to deallocate network for instance. [ 808.502313] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efc88ee7-caa1-4bc4-bac7-632d96f2c478 tempest-ServersAdminNegativeTestJSON-752970805 tempest-ServersAdminNegativeTestJSON-752970805-project-member] Lock "8909b0c9-f236-4d22-b3f1-3bf15b82aa0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.391s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.521304] env[67820]: DEBUG nova.compute.manager [None req-cfd23513-cb83-4b8e-90e6-eb9e5ae733d3 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: ed58f91c-e284-4877-8d7c-f1f9e9f1add8] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.600060] env[67820]: DEBUG nova.compute.manager [None req-cfd23513-cb83-4b8e-90e6-eb9e5ae733d3 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: ed58f91c-e284-4877-8d7c-f1f9e9f1add8] Instance disappeared before build.
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 808.678601] env[67820]: DEBUG oslo_concurrency.lockutils [None req-cfd23513-cb83-4b8e-90e6-eb9e5ae733d3 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "ed58f91c-e284-4877-8d7c-f1f9e9f1add8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.479s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.703771] env[67820]: DEBUG nova.compute.manager [None req-226f1a8b-62a1-44b3-8f59-ea4a403d7d54 tempest-ServersAdmin275Test-1182303969 tempest-ServersAdmin275Test-1182303969-project-member] [instance: 3ed56b60-f2a0-4ada-ace0-3b85d93693f1] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.710129] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff79f0f-7d00-4205-9e54-136c26a1e095 tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "0421f92d-8c91-4a60-beb9-f1a799e6d1b4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.330s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.738121] env[67820]: DEBUG nova.compute.manager [None req-226f1a8b-62a1-44b3-8f59-ea4a403d7d54 tempest-ServersAdmin275Test-1182303969 tempest-ServersAdmin275Test-1182303969-project-member] [instance: 3ed56b60-f2a0-4ada-ace0-3b85d93693f1] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 808.767251] env[67820]: DEBUG oslo_concurrency.lockutils [None req-226f1a8b-62a1-44b3-8f59-ea4a403d7d54 tempest-ServersAdmin275Test-1182303969 tempest-ServersAdmin275Test-1182303969-project-member] Lock "3ed56b60-f2a0-4ada-ace0-3b85d93693f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 212.749s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.777799] env[67820]: DEBUG nova.compute.manager [None req-e8b4a595-43d9-49eb-b7c4-4f3887c6469d tempest-ServersWithSpecificFlavorTestJSON-693409077 tempest-ServersWithSpecificFlavorTestJSON-693409077-project-member] [instance: 7fc2bae2-34a4-472a-a097-f93245de32bd] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.812566] env[67820]: DEBUG nova.compute.manager [None req-e8b4a595-43d9-49eb-b7c4-4f3887c6469d tempest-ServersWithSpecificFlavorTestJSON-693409077 tempest-ServersWithSpecificFlavorTestJSON-693409077-project-member] [instance: 7fc2bae2-34a4-472a-a097-f93245de32bd] Instance disappeared before build.
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 808.849415] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e8b4a595-43d9-49eb-b7c4-4f3887c6469d tempest-ServersWithSpecificFlavorTestJSON-693409077 tempest-ServersWithSpecificFlavorTestJSON-693409077-project-member] Lock "7fc2bae2-34a4-472a-a097-f93245de32bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.511s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.864771] env[67820]: DEBUG nova.compute.manager [None req-4d4b2146-9c6e-4365-adfa-a75355122047 tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] [instance: 18d924ce-620d-45d1-92cf-3f8cfa5a81b9] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.894859] env[67820]: DEBUG nova.compute.manager [None req-4d4b2146-9c6e-4365-adfa-a75355122047 tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] [instance: 18d924ce-620d-45d1-92cf-3f8cfa5a81b9] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 808.920406] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4d4b2146-9c6e-4365-adfa-a75355122047 tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Lock "18d924ce-620d-45d1-92cf-3f8cfa5a81b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.533s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 808.936399] env[67820]: DEBUG nova.compute.manager [None req-11420d02-e21b-4a3d-819f-c540282d5490 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] [instance: 66add717-36cf-4328-8d8c-32beb0eca333] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 808.977534] env[67820]: DEBUG nova.compute.manager [None req-11420d02-e21b-4a3d-819f-c540282d5490 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] [instance: 66add717-36cf-4328-8d8c-32beb0eca333] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.033165] env[67820]: DEBUG oslo_concurrency.lockutils [None req-11420d02-e21b-4a3d-819f-c540282d5490 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Lock "66add717-36cf-4328-8d8c-32beb0eca333" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.488s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.045674] env[67820]: DEBUG nova.compute.manager [None req-7f33c58f-d3c2-4536-b751-42f9e23b550f tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] [instance: 5a22e980-f0c4-4fca-a4bf-e5c1347e9b80] Starting instance...
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.091224] env[67820]: DEBUG nova.compute.manager [None req-7f33c58f-d3c2-4536-b751-42f9e23b550f tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] [instance: 5a22e980-f0c4-4fca-a4bf-e5c1347e9b80] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.147538] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7f33c58f-d3c2-4536-b751-42f9e23b550f tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Lock "5a22e980-f0c4-4fca-a4bf-e5c1347e9b80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.765s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.165672] env[67820]: DEBUG nova.compute.manager [None req-66665645-914b-49c0-ac8a-35ec51f6ee00 tempest-ServersTestJSON-243953592 tempest-ServersTestJSON-243953592-project-member] [instance: 91d81e24-4166-4135-9a3b-fe117bdf9c2d] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.201698] env[67820]: DEBUG nova.compute.manager [None req-66665645-914b-49c0-ac8a-35ec51f6ee00 tempest-ServersTestJSON-243953592 tempest-ServersTestJSON-243953592-project-member] [instance: 91d81e24-4166-4135-9a3b-fe117bdf9c2d] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.240257] env[67820]: DEBUG oslo_concurrency.lockutils [None req-66665645-914b-49c0-ac8a-35ec51f6ee00 tempest-ServersTestJSON-243953592 tempest-ServersTestJSON-243953592-project-member] Lock "91d81e24-4166-4135-9a3b-fe117bdf9c2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.452s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.256064] env[67820]: DEBUG nova.compute.manager [None req-1c5c390a-9fce-4725-928b-bc6ea81abc4e tempest-ServerDiagnosticsNegativeTest-1545651443 tempest-ServerDiagnosticsNegativeTest-1545651443-project-member] [instance: d41bef54-918d-4a67-9666-4beb8bd1d1dc] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.293232] env[67820]: DEBUG nova.compute.manager [None req-1c5c390a-9fce-4725-928b-bc6ea81abc4e tempest-ServerDiagnosticsNegativeTest-1545651443 tempest-ServerDiagnosticsNegativeTest-1545651443-project-member] [instance: d41bef54-918d-4a67-9666-4beb8bd1d1dc] Instance disappeared before build.
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.318941] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1c5c390a-9fce-4725-928b-bc6ea81abc4e tempest-ServerDiagnosticsNegativeTest-1545651443 tempest-ServerDiagnosticsNegativeTest-1545651443-project-member] Lock "d41bef54-918d-4a67-9666-4beb8bd1d1dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.198s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.330245] env[67820]: DEBUG nova.compute.manager [None req-39398c22-bea1-450a-80fe-4d59fe3c23ba tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] [instance: b566340c-f89b-43e3-afae-83b67d4b169c] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.364056] env[67820]: DEBUG nova.compute.manager [None req-39398c22-bea1-450a-80fe-4d59fe3c23ba tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] [instance: b566340c-f89b-43e3-afae-83b67d4b169c] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.398610] env[67820]: DEBUG oslo_concurrency.lockutils [None req-39398c22-bea1-450a-80fe-4d59fe3c23ba tempest-ListImageFiltersTestJSON-1311827799 tempest-ListImageFiltersTestJSON-1311827799-project-member] Lock "b566340c-f89b-43e3-afae-83b67d4b169c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.101s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.412412] env[67820]: DEBUG nova.compute.manager [None req-3f22e63e-365c-49c5-87d3-8ddc7c25112c tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] [instance: 87d3ff6f-df3e-4fbe-98b3-98878945da63] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.446706] env[67820]: DEBUG nova.compute.manager [None req-3f22e63e-365c-49c5-87d3-8ddc7c25112c tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] [instance: 87d3ff6f-df3e-4fbe-98b3-98878945da63] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.471528] env[67820]: DEBUG oslo_concurrency.lockutils [None req-3f22e63e-365c-49c5-87d3-8ddc7c25112c tempest-ServerRescueNegativeTestJSON-35720632 tempest-ServerRescueNegativeTestJSON-35720632-project-member] Lock "87d3ff6f-df3e-4fbe-98b3-98878945da63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.147s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.491474] env[67820]: DEBUG nova.compute.manager [None req-a7e1b9e3-e593-4d7d-ab46-8dab1e3e85f8 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] [instance: 6c978d86-e29b-4fd6-99e6-1ac37678871d] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.525489] env[67820]: DEBUG nova.compute.manager [None req-a7e1b9e3-e593-4d7d-ab46-8dab1e3e85f8 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] [instance: 6c978d86-e29b-4fd6-99e6-1ac37678871d] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.564111] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a7e1b9e3-e593-4d7d-ab46-8dab1e3e85f8 tempest-ServerShowV247Test-2142907380 tempest-ServerShowV247Test-2142907380-project-member] Lock "6c978d86-e29b-4fd6-99e6-1ac37678871d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.139s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.577182] env[67820]: DEBUG nova.compute.manager [None req-d987cfa7-92a5-450c-b419-beb7ddb9fb93 tempest-ImagesOneServerTestJSON-1009029765 tempest-ImagesOneServerTestJSON-1009029765-project-member] [instance: 08b4411d-6ed5-453a-92f2-1dd0b1ee2140] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.615330] env[67820]: DEBUG nova.compute.manager [None req-d987cfa7-92a5-450c-b419-beb7ddb9fb93 tempest-ImagesOneServerTestJSON-1009029765 tempest-ImagesOneServerTestJSON-1009029765-project-member] [instance: 08b4411d-6ed5-453a-92f2-1dd0b1ee2140] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 809.644515] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d987cfa7-92a5-450c-b419-beb7ddb9fb93 tempest-ImagesOneServerTestJSON-1009029765 tempest-ImagesOneServerTestJSON-1009029765-project-member] Lock "08b4411d-6ed5-453a-92f2-1dd0b1ee2140" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.273s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 809.654719] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 809.731133] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.731478] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.733039] env[67820]: INFO nova.compute.claims [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 810.183040] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70029094-dd86-4753-a5c0-03458621c136 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.192908] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21d6aad5-4a40-45c3-8fd2-75d954094454 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.231185] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37f6c63f-e164-46e6-9322-20cdc19d01db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.239662] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2679c6-44d9-483b-bd23-cd06e9dc5453 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.255052] env[67820]: DEBUG nova.compute.provider_tree [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 810.263939] env[67820]: DEBUG nova.scheduler.client.report [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 810.281829] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.550s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 810.282394] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 810.335836] env[67820]: DEBUG nova.compute.utils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 810.337465] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 810.337558] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 810.351330] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 810.446586] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 810.465150] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8bb51154-3876-46f5-a1e7-ac76a980bb7f tempest-ServerMetadataNegativeTestJSON-1278293935 tempest-ServerMetadataNegativeTestJSON-1278293935-project-member] Acquiring lock "22905445-5120-4cdd-b965-099001e4147c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.465150] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8bb51154-3876-46f5-a1e7-ac76a980bb7f tempest-ServerMetadataNegativeTestJSON-1278293935 tempest-ServerMetadataNegativeTestJSON-1278293935-project-member] Lock "22905445-5120-4cdd-b965-099001e4147c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.491116] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 810.491116] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 810.491116] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 810.491354] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 810.491354] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 810.491556] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 
tempest-ImagesNegativeTestJSON-1053268250-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 810.492462] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 810.492783] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 810.493120] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 810.493517] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 810.495097] env[67820]: DEBUG nova.virt.hardware [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 810.495215] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5431167-c51b-4750-9506-2090eea3fc19 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.504558] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b27bb7da-894b-4c92-ac75-6ef815015416 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 810.510414] env[67820]: DEBUG nova.policy [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '66ea121e7ff344fc8146bfc339df6204', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b9a34ffe740c4fc58bc6381f7c3c3b11', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 811.180143] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Successfully created 
port: 5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 811.532505] env[67820]: DEBUG oslo_concurrency.lockutils [None req-06922a7e-d854-425d-8436-990306a135a1 tempest-ServersNegativeTestJSON-141145616 tempest-ServersNegativeTestJSON-141145616-project-member] Acquiring lock "37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.533392] env[67820]: DEBUG oslo_concurrency.lockutils [None req-06922a7e-d854-425d-8436-990306a135a1 tempest-ServersNegativeTestJSON-141145616 tempest-ServersNegativeTestJSON-141145616-project-member] Lock "37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.350900] env[67820]: DEBUG oslo_concurrency.lockutils [None req-88a74e22-7e18-4f46-995e-f8ddad65584e tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Acquiring lock "53757aa2-7013-42a5-94bd-e831c8f08c40" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.351152] env[67820]: DEBUG oslo_concurrency.lockutils [None req-88a74e22-7e18-4f46-995e-f8ddad65584e tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "53757aa2-7013-42a5-94bd-e831c8f08c40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.417483] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Successfully updated port: 5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 812.437328] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 812.437490] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquired lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 812.437642] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 
812.484769] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 812.698251] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Updating instance_info_cache with network_info: [{"id": "5c711a83-d45e-4424-97f7-06cee9385803", "address": "fa:16:3e:ca:0a:2a", "network": {"id": "9eca80a5-57c2-495e-8b54-8e9c797b9d88", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-978665872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9a34ffe740c4fc58bc6381f7c3c3b11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c711a83-d4", "ovs_interfaceid": "5c711a83-d45e-4424-97f7-06cee9385803", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.715875] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Releasing lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 812.716207] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance network_info: |[{"id": "5c711a83-d45e-4424-97f7-06cee9385803", "address": "fa:16:3e:ca:0a:2a", "network": {"id": "9eca80a5-57c2-495e-8b54-8e9c797b9d88", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-978665872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9a34ffe740c4fc58bc6381f7c3c3b11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c711a83-d4", "ovs_interfaceid": 
"5c711a83-d45e-4424-97f7-06cee9385803", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 812.717242] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:0a:2a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ee9ce73d-4ee8-4b28-b7d3-3a5735039627', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c711a83-d45e-4424-97f7-06cee9385803', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 812.725426] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Creating folder: Project (b9a34ffe740c4fc58bc6381f7c3c3b11). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 812.726508] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f32d1960-4ef2-47f2-8a0f-b0d4ab4d78cc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.738202] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Created folder: Project (b9a34ffe740c4fc58bc6381f7c3c3b11) in parent group-v692668. [ 812.738407] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Creating folder: Instances. Parent ref: group-v692711. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 812.738644] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-526f0870-1861-40b5-8f35-44846e803b96 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.748030] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Created folder: Instances in parent group-v692711. [ 812.748142] env[67820]: DEBUG oslo.service.loopingcall [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 812.748332] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 812.748567] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c20d33f5-1638-4b49-a4ff-dfe8900ad386 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.771068] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 812.771068] env[67820]: value = "task-3467335" [ 812.771068] env[67820]: _type = "Task" [ 812.771068] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 812.779346] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467335, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 812.958977] env[67820]: DEBUG nova.compute.manager [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Received event network-vif-plugged-5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 812.959224] env[67820]: DEBUG oslo_concurrency.lockutils [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] Acquiring lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 812.959505] env[67820]: DEBUG oslo_concurrency.lockutils [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] Lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.959629] env[67820]: DEBUG oslo_concurrency.lockutils [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] Lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 812.960613] env[67820]: DEBUG nova.compute.manager [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] No waiting events found dispatching network-vif-plugged-5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 812.960613] env[67820]: WARNING nova.compute.manager [req-a001b638-65d6-4d17-a7b0-f9d7222ce7d5 req-79940d19-e968-442e-9081-3fcbd6543f84 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Received unexpected event network-vif-plugged-5c711a83-d45e-4424-97f7-06cee9385803 for instance with vm_state building and task_state spawning. 
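The CreateVM_Task records just above and below show the oslo.vmware poll-until-done pattern: the driver invokes a vCenter task, logs the returned task handle ((returnval){ value = "task-3467335" ... }), then repeatedly logs "progress is N%." until the task completes or fails (see wait_for_task/_poll_task in the paths). Below is a minimal runnable sketch of that loop; make_fake_task, the state strings, and the timings are illustrative stand-ins, not the oslo.vmware API.

```python
import itertools
import time


def make_fake_task(steps=3):
    """Simulate a vCenter task that reports 'running' a few times and
    then 'success'. Stands in for the task-state reads oslo.vmware
    performs against vCenter; purely illustrative."""
    counter = itertools.count(1)

    def fetch_state():
        n = next(counter)
        if n >= steps:
            return "success", 100, None
        return "running", int(100 * n / steps), None

    return fetch_state


def wait_for_task(fetch_state, interval=0.1, timeout=30.0):
    """Poll until the task reaches a terminal state, mirroring the
    'progress is N%.' / 'completed successfully.' lines in this log."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state, progress, error = fetch_state()
        print(f"progress is {progress}%.")
        if state == "success":
            print("completed successfully.")
            return
        if state == "error":
            raise RuntimeError(f"task failed: {error}")
        time.sleep(interval)  # still queued/running: back off and re-poll
    raise TimeoutError(f"task did not complete within {timeout}s")


if __name__ == "__main__":
    wait_for_task(make_fake_task())
```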
[ 813.290020] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467335, 'name': CreateVM_Task, 'duration_secs': 0.339965} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 813.290020] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 813.290020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 813.290020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 813.290020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 813.290318] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a821d31e-635d-4b9a-9f19-4a9bd48ff35f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 813.294170] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for the task: (returnval){
[ 813.294170] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52977234-10b6-9a5e-3aa0-2d6f801a0aca"
[ 813.294170] env[67820]: _type = "Task"
[ 813.294170] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 813.301962] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52977234-10b6-9a5e-3aa0-2d6f801a0aca, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 813.804034] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 813.804329] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 813.804329] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 814.587251] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a161f849-d48c-49bb-b4e7-346d4f032f46 tempest-ServersTestManualDisk-1039961554 tempest-ServersTestManualDisk-1039961554-project-member] Acquiring lock "76ca6882-c1e0-4ae0-af8d-7d5673a13540" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.587690] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a161f849-d48c-49bb-b4e7-346d4f032f46 tempest-ServersTestManualDisk-1039961554 tempest-ServersTestManualDisk-1039961554-project-member] Lock "76ca6882-c1e0-4ae0-af8d-7d5673a13540" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 815.314553] env[67820]: DEBUG nova.compute.manager [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Received event network-changed-5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 815.314782] env[67820]: DEBUG nova.compute.manager [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Refreshing instance network info cache due to event network-changed-5c711a83-d45e-4424-97f7-06cee9385803. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 815.314946] env[67820]: DEBUG oslo_concurrency.lockutils [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] Acquiring lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 815.315177] env[67820]: DEBUG oslo_concurrency.lockutils [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] Acquired lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 815.315255] env[67820]: DEBUG nova.network.neutron [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Refreshing network info cache for port 5c711a83-d45e-4424-97f7-06cee9385803 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 815.616135] env[67820]: DEBUG nova.network.neutron [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Updated VIF entry in instance network info cache for port 5c711a83-d45e-4424-97f7-06cee9385803. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 815.616870] env[67820]: DEBUG nova.network.neutron [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Updating instance_info_cache with network_info: [{"id": "5c711a83-d45e-4424-97f7-06cee9385803", "address": "fa:16:3e:ca:0a:2a", "network": {"id": "9eca80a5-57c2-495e-8b54-8e9c797b9d88", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-978665872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9a34ffe740c4fc58bc6381f7c3c3b11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c711a83-d4", "ovs_interfaceid": "5c711a83-d45e-4424-97f7-06cee9385803", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 815.621633] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.621633] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.621831] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 815.628696] env[67820]: DEBUG oslo_concurrency.lockutils [req-33114fff-f921-498a-acdc-b5873264663c req-4ffda3cc-5eab-4a68-a263-89eea9f87388 service nova] Releasing lock "refresh_cache-906c40dd-b6d6-492a-aa51-58901959a60d" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 815.642901] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 0 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 815.643655] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.643655] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 815.652341] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.660218] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.621429] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.622668] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 819.622822] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 819.651172] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.652530] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.652770] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653019] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653073] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653173] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653323] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653405] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653519] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653630] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 819.653747] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 819.654274] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.654443] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.654606] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 819.654736] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 820.620818] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 820.633070] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 820.633454] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 820.633547] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 820.633895] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 820.636699] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faabb5ca-5d84-4ed1-adf2-de880bc7f1d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.645686] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e6e79b8-2d69-449c-a479-bef394979a6d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.659805] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0b82ad65-e6d1-4cdd-ba81-f3dc4f5c4292 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.669323] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59958d0a-52e5-4df7-8bbe-395e0a7a3d64 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.703902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180948MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 820.704067] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 820.704279] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 820.860136] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860314] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860444] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860571] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860689] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860806] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.860923] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.861050] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.861168] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.861283] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 820.877135] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5feadca8-33f0-4dac-8e1d-162c77919c77 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.886617] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "906c40dd-b6d6-492a-aa51-58901959a60d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 820.889039] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.902632] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance da533036-7e32-4078-9060-6ee7680cba5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.915346] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4e356219-fa88-474c-97fe-6f6a6ef0c90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.925281] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 454a0392-c614-4ca8-903e-48efa44be22f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.936047] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 90319a68-5613-4b18-91d3-b606d258ced9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.948038] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance b80e4804-6f23-4059-8e9c-bf8ecdc2efc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.959116] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8e49f35a-037b-4db4-8bec-005b50905852 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.970224] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 820.982387] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.001089] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.015346] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49515d53-7359-4bc6-8a26-95ffd4fe4ed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.028035] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.039497] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 22905445-5120-4cdd-b965-099001e4147c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.051814] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.064928] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 53757aa2-7013-42a5-94bd-e831c8f08c40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.086055] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 76ca6882-c1e0-4ae0-af8d-7d5673a13540 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 821.086337] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 821.086489] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 821.105632] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 821.124015] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 821.124242] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 821.143429] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 821.174800] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: 
COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 821.662781] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21d99272-72dd-40e3-98b0-f0e91d7dbacc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.673631] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d89829be-5b02-477f-9267-3101fc7d728e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.711642] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-777b7a20-f465-4b90-b36f-fe176b30cf2e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.722989] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be0c87d6-dca3-491a-b2f3-29aee3a5182d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 821.734148] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 821.745821] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 821.770302] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 821.771275] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.066s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 822.766690] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 822.767126] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 824.593785] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-57399e52-706b-46da-916c-3f60fae914f1 tempest-ServersNegativeTestMultiTenantJSON-57916155 tempest-ServersNegativeTestMultiTenantJSON-57916155-project-member] Acquiring lock "f77a97a9-3c8c-4484-8948-dd6dc9dc4077" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 824.594509] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57399e52-706b-46da-916c-3f60fae914f1 tempest-ServersNegativeTestMultiTenantJSON-57916155 tempest-ServersNegativeTestMultiTenantJSON-57916155-project-member] Lock "f77a97a9-3c8c-4484-8948-dd6dc9dc4077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 833.536102] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d71f314c-5f47-475e-a05e-4febc95e1c7f tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Acquiring lock "128f3466-5304-44dc-a569-2ba894f5333c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 833.536634] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d71f314c-5f47-475e-a05e-4febc95e1c7f tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Lock "128f3466-5304-44dc-a569-2ba894f5333c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 838.667435] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79f5bb8e-dfdb-4766-93c3-007a34828c33 tempest-ServerPasswordTestJSON-958337142 tempest-ServerPasswordTestJSON-958337142-project-member] Acquiring lock "25a4642e-155a-473f-953c-b0fedbb6eac0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 838.671635] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79f5bb8e-dfdb-4766-93c3-007a34828c33 tempest-ServerPasswordTestJSON-958337142 tempest-ServerPasswordTestJSON-958337142-project-member] Lock "25a4642e-155a-473f-953c-b0fedbb6eac0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 840.393989] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f0f99f06-0777-43a2-ad81-ef5b64fe9c5a tempest-ServerMetadataTestJSON-272358107 tempest-ServerMetadataTestJSON-272358107-project-member] Acquiring lock "8933061f-5489-447c-80ca-28d4a427349e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.394654] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f0f99f06-0777-43a2-ad81-ef5b64fe9c5a tempest-ServerMetadataTestJSON-272358107 tempest-ServerMetadataTestJSON-272358107-project-member] Lock "8933061f-5489-447c-80ca-28d4a427349e" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 841.740854] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5e5baf12-b83d-4cad-83d4-d9143ad3b1f6 tempest-InstanceActionsV221TestJSON-1418511094 tempest-InstanceActionsV221TestJSON-1418511094-project-member] Acquiring lock "d46bcc6e-6c3a-4200-a7f3-4c571ba6d819" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 841.741202] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5e5baf12-b83d-4cad-83d4-d9143ad3b1f6 tempest-InstanceActionsV221TestJSON-1418511094 tempest-InstanceActionsV221TestJSON-1418511094-project-member] Lock "d46bcc6e-6c3a-4200-a7f3-4c571ba6d819" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 843.190696] env[67820]: DEBUG oslo_concurrency.lockutils [None req-68d36599-6926-41e4-ac0f-b2fba62cc1ca tempest-ServerAddressesNegativeTestJSON-708644920 tempest-ServerAddressesNegativeTestJSON-708644920-project-member] Acquiring lock "d962be83-2769-465b-8628-1c1b656b6830" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 843.191083] env[67820]: DEBUG oslo_concurrency.lockutils [None req-68d36599-6926-41e4-ac0f-b2fba62cc1ca tempest-ServerAddressesNegativeTestJSON-708644920 tempest-ServerAddressesNegativeTestJSON-708644920-project-member] Lock "d962be83-2769-465b-8628-1c1b656b6830" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 853.581862] env[67820]: WARNING oslo_vmware.rw_handles [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles 
http.client.RemoteDisconnected: Remote end closed connection without response [ 853.581862] env[67820]: ERROR oslo_vmware.rw_handles [ 853.582545] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 853.584027] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 853.584287] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Copying Virtual Disk [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/48bc52a2-70bd-4531-bdc8-ac15ffe51f58/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 853.584595] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ea0e49f6-ec38-4296-9ed3-05c17b8aa5d5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.593035] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for the task: (returnval){ [ 853.593035] env[67820]: value = "task-3467336" [ 853.593035] env[67820]: _type = "Task" [ 853.593035] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 853.600513] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Task: {'id': task-3467336, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 854.103790] env[67820]: DEBUG oslo_vmware.exceptions [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 854.104095] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 854.104646] env[67820]: ERROR nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 854.104646] env[67820]: Faults: ['InvalidArgument'] [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Traceback (most recent call last): [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] yield resources [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self.driver.spawn(context, instance, image_meta, [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self._fetch_image_if_missing(context, vi) [ 854.104646] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] image_cache(vi, tmp_image_ds_loc) [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] vm_util.copy_virtual_disk( [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] session._wait_for_task(vmdk_copy_task) [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return self.wait_for_task(task_ref) [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return evt.wait() [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] result = hub.switch() [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 854.105043] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return self.greenlet.switch() [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self.f(*self.args, **self.kw) [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] raise exceptions.translate_fault(task_info.error) [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Faults: ['InvalidArgument'] [ 854.105450] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] [ 854.105450] env[67820]: INFO nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Terminating instance [ 854.106525] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 854.106728] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 854.106967] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a4220aa2-b01c-4fd3-9462-df7879682962 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.109155] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 854.109347] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 854.110139] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49613c24-b59a-43cd-a33a-a5f9bda35467 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.117932] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 854.118161] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7ac7e484-8bf3-40ba-aa80-a61ab58a597e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.120199] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 854.120365] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 854.121306] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f8fe2da6-b345-415a-ab40-be217215334e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.126284] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for the task: (returnval){ [ 854.126284] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5286d488-82fa-40af-7e54-6ee3d646f5df" [ 854.126284] env[67820]: _type = "Task" [ 854.126284] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 854.133339] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5286d488-82fa-40af-7e54-6ee3d646f5df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 854.185951] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 854.186227] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 854.186468] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Deleting the datastore file [datastore1] 573a28e3-bfc4-4b08-919b-65acbca79c7b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 854.186808] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ed53c8df-6b93-4ef9-a781-a62bd7eda9e7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.194242] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for the task: (returnval){ [ 854.194242] env[67820]: value = "task-3467338" [ 854.194242] env[67820]: _type = "Task" [ 854.194242] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 854.202427] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Task: {'id': task-3467338, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 854.637030] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 854.637391] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Creating directory with path [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 854.637391] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b4bd320c-08e3-410c-85ed-e6a0efcac367 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.648939] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Created directory with path [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 854.649162] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Fetch image to [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 854.649333] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 854.650145] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2008a30-e893-4919-b116-ac6f9555de54 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.657037] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21d7c2e-b1a7-4c7d-83a0-0074440d891e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.668347] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98f8f2ac-da24-4234-8e22-161dee92db22 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.703084] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-517c7c68-f897-4baf-a5cf-d195a76b38d0 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.710489] env[67820]: DEBUG oslo_vmware.api [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Task: {'id': task-3467338, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077969} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 854.712301] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 854.712541] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 854.712753] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 854.713034] env[67820]: INFO nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 854.715853] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f1b2e1e-e3fb-4417-a23a-15dc61e16c5c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.717040] env[67820]: DEBUG nova.compute.claims [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 854.717377] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 854.717603] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 854.740272] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 854.823090] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 854.885029] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 854.885029] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 855.147844] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4750fd78-0370-45c9-aa70-1a1223759ae2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.155523] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc6cd1d6-c1ea-4436-aaac-2f5b78b78c1e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.185432] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b0a5b9-d3a5-4bc7-b3e5-51922feb4dcd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.192356] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2cd51b3-f96d-4d3a-991e-c8f71b75d16e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.205428] env[67820]: DEBUG nova.compute.provider_tree [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 855.215532] env[67820]: DEBUG nova.scheduler.client.report [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 855.228438] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.511s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.228975] env[67820]: ERROR nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 855.228975] env[67820]: Faults: ['InvalidArgument']
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Traceback (most recent call last):
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self.driver.spawn(context, instance, image_meta,
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self._fetch_image_if_missing(context, vi)
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] image_cache(vi, tmp_image_ds_loc)
[ 855.228975] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] vm_util.copy_virtual_disk(
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] session._wait_for_task(vmdk_copy_task)
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return self.wait_for_task(task_ref)
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return evt.wait()
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] result = hub.switch()
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] return self.greenlet.switch()
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 855.229391] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] self.f(*self.args, **self.kw)
[ 855.229812] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 855.229812] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] raise exceptions.translate_fault(task_info.error)
[ 855.229812] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 855.229812] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Faults: ['InvalidArgument']
[ 855.229812] env[67820]: ERROR nova.compute.manager [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b]
[ 855.229812] env[67820]: DEBUG nova.compute.utils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 855.231063] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Build of instance 573a28e3-bfc4-4b08-919b-65acbca79c7b was re-scheduled: A specified parameter was not correct: fileType
[ 855.231063] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 855.231443] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 855.231631] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 855.231799] env[67820]: DEBUG nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 855.231957] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 855.536626] env[67820]: DEBUG nova.network.neutron [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 855.547355] env[67820]: INFO nova.compute.manager [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Took 0.32 seconds to deallocate network for instance.
[ 855.655136] env[67820]: INFO nova.scheduler.client.report [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Deleted allocations for instance 573a28e3-bfc4-4b08-919b-65acbca79c7b
[ 855.677041] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7a12d423-60e7-449d-b110-1507a2256bfc tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 287.805s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.679928] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 87.350s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 855.679928] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Acquiring lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 855.679928] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 855.680259] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.681547] env[67820]: INFO nova.compute.manager [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Terminating instance
[ 855.684532] env[67820]: DEBUG nova.compute.manager [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 855.684777] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 855.685081] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-815d0bf5-8d40-4a97-9a94-eb74ee077baf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.689669] env[67820]: DEBUG nova.compute.manager [None req-0fed6338-d74f-4e45-a2b7-f7e9c752db04 tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] [instance: 5feadca8-33f0-4dac-8e1d-162c77919c77] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 855.703204] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd9e726-0337-4709-a40a-7f8ce40ee170 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 855.716917] env[67820]: DEBUG nova.compute.manager [None req-0fed6338-d74f-4e45-a2b7-f7e9c752db04 tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] [instance: 5feadca8-33f0-4dac-8e1d-162c77919c77] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 855.737609] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 573a28e3-bfc4-4b08-919b-65acbca79c7b could not be found.
[ 855.737936] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 855.738209] env[67820]: INFO nova.compute.manager [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 855.738577] env[67820]: DEBUG oslo.service.loopingcall [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 855.741047] env[67820]: DEBUG nova.compute.manager [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 855.741235] env[67820]: DEBUG nova.network.neutron [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 855.755259] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0fed6338-d74f-4e45-a2b7-f7e9c752db04 tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Lock "5feadca8-33f0-4dac-8e1d-162c77919c77" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.885s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.764116] env[67820]: DEBUG nova.network.neutron [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 855.765960] env[67820]: DEBUG nova.compute.manager [None req-51d4a037-8ff1-436a-babe-812ab4df76e5 tempest-ServerRescueTestJSON-1486015266 tempest-ServerRescueTestJSON-1486015266-project-member] [instance: 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 855.776029] env[67820]: INFO nova.compute.manager [-] [instance: 573a28e3-bfc4-4b08-919b-65acbca79c7b] Took 0.03 seconds to deallocate network for instance.
[ 855.790027] env[67820]: DEBUG nova.compute.manager [None req-51d4a037-8ff1-436a-babe-812ab4df76e5 tempest-ServerRescueTestJSON-1486015266 tempest-ServerRescueTestJSON-1486015266-project-member] [instance: 2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 855.812903] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51d4a037-8ff1-436a-babe-812ab4df76e5 tempest-ServerRescueTestJSON-1486015266 tempest-ServerRescueTestJSON-1486015266-project-member] Lock "2c2c96a0-fd0e-4d48-8c3e-6fe66fc555f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.348s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.835017] env[67820]: DEBUG nova.compute.manager [None req-f301528a-ae39-4cbc-9d5a-f03a736f97fc tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] [instance: da533036-7e32-4078-9060-6ee7680cba5f] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 855.865733] env[67820]: DEBUG nova.compute.manager [None req-f301528a-ae39-4cbc-9d5a-f03a736f97fc tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] [instance: da533036-7e32-4078-9060-6ee7680cba5f] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 855.895120] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f301528a-ae39-4cbc-9d5a-f03a736f97fc tempest-ServersAdminTestJSON-1778720660 tempest-ServersAdminTestJSON-1778720660-project-member] Lock "da533036-7e32-4078-9060-6ee7680cba5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.610s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.907456] env[67820]: DEBUG nova.compute.manager [None req-b752fbe6-46ca-44b6-b052-b77aa50af329 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454-project-member] [instance: 4e356219-fa88-474c-97fe-6f6a6ef0c90d] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 855.909866] env[67820]: DEBUG oslo_concurrency.lockutils [None req-94b38de8-9776-4656-ab6e-4590159ad2fb tempest-ServerExternalEventsTest-1259596175 tempest-ServerExternalEventsTest-1259596175-project-member] Lock "573a28e3-bfc4-4b08-919b-65acbca79c7b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.231s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.933200] env[67820]: DEBUG nova.compute.manager [None req-b752fbe6-46ca-44b6-b052-b77aa50af329 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454-project-member] [instance: 4e356219-fa88-474c-97fe-6f6a6ef0c90d] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 855.955181] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b752fbe6-46ca-44b6-b052-b77aa50af329 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454 tempest-FloatingIPsAssociationNegativeTestJSON-1169027454-project-member] Lock "4e356219-fa88-474c-97fe-6f6a6ef0c90d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.239s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 855.965955] env[67820]: DEBUG nova.compute.manager [None req-e207ec3d-d65a-4446-9f8f-4d954dd4a000 tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] [instance: 454a0392-c614-4ca8-903e-48efa44be22f] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 855.990524] env[67820]: DEBUG nova.compute.manager [None req-e207ec3d-d65a-4446-9f8f-4d954dd4a000 tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] [instance: 454a0392-c614-4ca8-903e-48efa44be22f] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 856.013693] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e207ec3d-d65a-4446-9f8f-4d954dd4a000 tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Lock "454a0392-c614-4ca8-903e-48efa44be22f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.536s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 856.023684] env[67820]: DEBUG nova.compute.manager [None req-d90fef5e-1cf7-4482-bbf5-b798e9a0355a tempest-ServerDiagnosticsTest-1880289301 tempest-ServerDiagnosticsTest-1880289301-project-member] [instance: 90319a68-5613-4b18-91d3-b606d258ced9] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 856.049450] env[67820]: DEBUG nova.compute.manager [None req-d90fef5e-1cf7-4482-bbf5-b798e9a0355a tempest-ServerDiagnosticsTest-1880289301 tempest-ServerDiagnosticsTest-1880289301-project-member] [instance: 90319a68-5613-4b18-91d3-b606d258ced9] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 856.072937] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d90fef5e-1cf7-4482-bbf5-b798e9a0355a tempest-ServerDiagnosticsTest-1880289301 tempest-ServerDiagnosticsTest-1880289301-project-member] Lock "90319a68-5613-4b18-91d3-b606d258ced9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.616s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 856.084956] env[67820]: DEBUG nova.compute.manager [None req-8fa02df1-30a7-4ecb-9dac-48e772013f76 tempest-ServersTestBootFromVolume-725629375 tempest-ServersTestBootFromVolume-725629375-project-member] [instance: b80e4804-6f23-4059-8e9c-bf8ecdc2efc2] Starting instance...
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 856.113562] env[67820]: DEBUG nova.compute.manager [None req-8fa02df1-30a7-4ecb-9dac-48e772013f76 tempest-ServersTestBootFromVolume-725629375 tempest-ServersTestBootFromVolume-725629375-project-member] [instance: b80e4804-6f23-4059-8e9c-bf8ecdc2efc2] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 856.145101] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8fa02df1-30a7-4ecb-9dac-48e772013f76 tempest-ServersTestBootFromVolume-725629375 tempest-ServersTestBootFromVolume-725629375-project-member] Lock "b80e4804-6f23-4059-8e9c-bf8ecdc2efc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 856.155473] env[67820]: DEBUG nova.compute.manager [None req-f5d236fa-aa9c-424f-be60-c88fc95baa29 tempest-AttachInterfacesUnderV243Test-989850202 tempest-AttachInterfacesUnderV243Test-989850202-project-member] [instance: 8e49f35a-037b-4db4-8bec-005b50905852] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 856.181733] env[67820]: DEBUG nova.compute.manager [None req-f5d236fa-aa9c-424f-be60-c88fc95baa29 tempest-AttachInterfacesUnderV243Test-989850202 tempest-AttachInterfacesUnderV243Test-989850202-project-member] [instance: 8e49f35a-037b-4db4-8bec-005b50905852] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 856.205788] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5d236fa-aa9c-424f-be60-c88fc95baa29 tempest-AttachInterfacesUnderV243Test-989850202 tempest-AttachInterfacesUnderV243Test-989850202-project-member] Lock "8e49f35a-037b-4db4-8bec-005b50905852" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.985s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 856.215065] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 856.274810] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 856.275161] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 856.277892] env[67820]: INFO nova.compute.claims [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 856.595968] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a953ef0b-ffbe-4316-a8f6-14f9d6a76f1b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 856.603634] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1f16063-422e-4f8b-9e1e-ffbe0053de92 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 856.634214] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca32abc-beb6-4362-9ab4-0bb8acb2a471 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 856.642009] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc27b265-faa3-4230-9ac4-ae7695532c5a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 856.655703] env[67820]: DEBUG nova.compute.provider_tree [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 856.664278] env[67820]: DEBUG nova.scheduler.client.report [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 856.678341] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 856.678657] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 856.717317] env[67820]: DEBUG nova.compute.utils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 856.718936] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 856.719626] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 856.728174] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 856.798965] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Start spawning the instance on the hypervisor.
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 856.812534] env[67820]: DEBUG nova.policy [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '20c0e91a2a46450980bef25b5a373f6a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b141581737a44c5894416bcaa7af709', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
[ 856.838755] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:45:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8f4417da-d004-4256-85e6-ea47f5da576b',id=21,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-346163275',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 856.838755] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 856.838755] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 856.839178] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 856.839549] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 856.839890] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 856.840251] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 856.840553] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 856.840857] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 856.841192] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 856.841520] env[67820]: DEBUG nova.virt.hardware [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 856.842560] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcb9f0c1-6301-4ef4-9e0c-533098d641fd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 856.851883] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f53cf070-771d-4fe8-bfc8-7d0a71fd330d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 857.454988] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Successfully created port: e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 858.335688] env[67820]: DEBUG nova.compute.manager [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Received event network-vif-plugged-e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 858.335972] env[67820]: DEBUG oslo_concurrency.lockutils [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] Acquiring lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 858.337034] env[67820]: DEBUG oslo_concurrency.lockutils [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 858.337034] env[67820]: DEBUG oslo_concurrency.lockutils [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 858.337034] env[67820]: DEBUG nova.compute.manager [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] No waiting events found dispatching network-vif-plugged-e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 858.337034] env[67820]: WARNING nova.compute.manager [req-c9dd1d04-fc92-428c-a0ac-3efb3b0ce072 req-0635c476-9f87-4f0e-a325-38e32aa97cb1 service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Received unexpected event network-vif-plugged-e2312bf3-7102-4620-81a5-a0cfccfe6506 for instance with vm_state building and task_state spawning.
[ 858.384386] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Successfully updated port: e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 858.400413] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 858.400557] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 858.400717] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 858.448251] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance cache missing network info.
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 858.682131] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Updating instance_info_cache with network_info: [{"id": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "address": "fa:16:3e:10:6a:96", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2312bf3-71", "ovs_interfaceid": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 858.697768] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 858.698105] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance network_info: |[{"id": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "address": "fa:16:3e:10:6a:96", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2312bf3-71", "ovs_interfaceid": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 858.698497] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:10:6a:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e2312bf3-7102-4620-81a5-a0cfccfe6506', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 858.706511] env[67820]: DEBUG oslo.service.loopingcall [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 858.707227] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 858.707351] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-90394dd6-91fe-4a34-8264-7f1697254e6f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 858.728961] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 858.728961] env[67820]: value = "task-3467339"
[ 858.728961] env[67820]: _type = "Task"
[ 858.728961] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 858.741025] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467339, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 859.245297] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467339, 'name': CreateVM_Task, 'duration_secs': 0.280718} completed successfully.
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 859.245450] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 859.246191] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 859.246407] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 859.246763] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 859.247081] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3b7f902e-ed03-4aca-80d1-41f71b92c906 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 859.252653] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){
[ 859.252653] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]528f199c-f71b-1ba5-500d-abaa33db15e2"
[ 859.252653] env[67820]: _type = "Task"
[ 859.252653] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 859.261473] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]528f199c-f71b-1ba5-500d-abaa33db15e2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 859.766526] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 859.766526] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 859.766526] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 860.425220] env[67820]: DEBUG nova.compute.manager [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Received event network-changed-e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 860.425414] env[67820]: DEBUG nova.compute.manager [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Refreshing instance network info cache due to event network-changed-e2312bf3-7102-4620-81a5-a0cfccfe6506. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 860.425622] env[67820]: DEBUG oslo_concurrency.lockutils [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] Acquiring lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 860.425757] env[67820]: DEBUG oslo_concurrency.lockutils [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] Acquired lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 860.425909] env[67820]: DEBUG nova.network.neutron [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Refreshing network info cache for port e2312bf3-7102-4620-81a5-a0cfccfe6506 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 860.794423] env[67820]: DEBUG nova.network.neutron [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Updated VIF entry in instance network info cache for port e2312bf3-7102-4620-81a5-a0cfccfe6506.
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 860.794423] env[67820]: DEBUG nova.network.neutron [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Updating instance_info_cache with network_info: [{"id": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "address": "fa:16:3e:10:6a:96", "network": {"id": "4e1ffb99-0a80-45cf-9e68-9506e58a08b0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0c9919c381ed4ae08ec1c6d27ce1eaac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2312bf3-71", "ovs_interfaceid": "e2312bf3-7102-4620-81a5-a0cfccfe6506", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 860.806013] env[67820]: DEBUG oslo_concurrency.lockutils [req-08b7f947-94fc-4150-9df8-b7270e028344 req-ddda414c-104d-40d8-bbe8-e6cbba50f4cb service nova] Releasing lock "refresh_cache-2311d6b7-32ab-45c6-83f8-9b341e847bf0" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 864.430943] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 865.945783] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "45a68888-979e-4255-98a0-bcb289f57830" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 865.946103] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 876.621683] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 878.622261] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 879.621540] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 879.621777] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 880.621739] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 880.622030] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 881.616565] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 881.644370] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 881.644548] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 881.644548] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 881.665023] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665023] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665023] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building.
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665233] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665345] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665473] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665596] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665715] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665833] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.665951] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 881.666081] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 881.666528] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 881.677826] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 881.678062] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 881.678233] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 881.678387] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 881.679457] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d53bc3c5-af49-472b-9773-fb4c520a45dd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 881.689668] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a7194b3-648d-4ff9-b7ec-3da739ec4550 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 881.704168] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922f8fe1-6593-4d9a-b5cb-5c3110809308 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 881.710458] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d9a82db-843c-4210-b7fb-d043de2bb038 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 881.739523] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180928MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 881.739679] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 881.739894] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 881.811104] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance be35d888-f649-44e4-af23-341b8bfc81f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.811214] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.811555] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.811735] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.811869] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.811992] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.812143] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 881.812266] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.812381] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.812493] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 881.823904] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.835493] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.846025] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49515d53-7359-4bc6-8a26-95ffd4fe4ed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.856308] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.866206] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 22905445-5120-4cdd-b965-099001e4147c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.878019] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.889033] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 53757aa2-7013-42a5-94bd-e831c8f08c40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.899431] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 76ca6882-c1e0-4ae0-af8d-7d5673a13540 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.911914] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f77a97a9-3c8c-4484-8948-dd6dc9dc4077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.920993] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 128f3466-5304-44dc-a569-2ba894f5333c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.931851] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 25a4642e-155a-473f-953c-b0fedbb6eac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.943211] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8933061f-5489-447c-80ca-28d4a427349e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.956007] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d46bcc6e-6c3a-4200-a7f3-4c571ba6d819 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.966385] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d962be83-2769-465b-8628-1c1b656b6830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.976389] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 881.976626] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 881.976787] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 882.261161] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69efb70c-aee5-488a-a6a1-880d5221c03f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.268696] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b4df91-26c4-4725-bd77-717c51cb1e36 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.297482] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-272baa16-4fc3-4bb6-b79d-0e53a5795d82 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.304164] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a4e79a0-37c7-4494-87bf-22dd0df90b82 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.316636] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 
0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 882.324899] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 882.339695] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 882.339894] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 883.295829] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 883.616360] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 902.952908] env[67820]: WARNING oslo_vmware.rw_handles [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 902.952908] env[67820]: ERROR oslo_vmware.rw_handles [ 902.953516] 
env[67820]: DEBUG nova.virt.vmwareapi.images [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 902.955275] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 902.955519] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Copying Virtual Disk [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/e03a5f2f-e003-4d33-817b-2dc53a900b13/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 902.955801] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8e3f551d-c14a-4ec6-aea3-7e5edf5efff8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.963425] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for the task: (returnval){ [ 902.963425] env[67820]: value = "task-3467340" [ 902.963425] env[67820]: _type = "Task" [ 902.963425] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 902.971292] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Task: {'id': task-3467340, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 903.474561] env[67820]: DEBUG oslo_vmware.exceptions [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 903.474823] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 903.475425] env[67820]: ERROR nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 903.475425] env[67820]: Faults: ['InvalidArgument'] [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Traceback (most recent call last): [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] yield resources [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self.driver.spawn(context, instance, image_meta, [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self._fetch_image_if_missing(context, vi) [ 903.475425] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] image_cache(vi, tmp_image_ds_loc) [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] vm_util.copy_virtual_disk( [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] session._wait_for_task(vmdk_copy_task) [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return self.wait_for_task(task_ref) [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return evt.wait() [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] result = hub.switch() [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 903.475852] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return self.greenlet.switch() [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self.f(*self.args, **self.kw) [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] raise exceptions.translate_fault(task_info.error) [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Faults: ['InvalidArgument'] [ 903.476237] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] [ 903.476237] env[67820]: INFO nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Terminating instance [ 903.477601] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 903.477601] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 903.477601] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c6c3a08b-ca4a-459c-9974-473766beee45 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.479768] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 903.479894] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 903.480715] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0228f40-41e8-4da4-8722-63d58a74e744 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.487342] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 903.487556] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-719efca2-2f1b-4d2d-a9fc-381727a91713 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.489797] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 903.489994] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 903.490903] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0eb87008-2705-42d3-a05e-b34061c24e13 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.495439] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for the task: (returnval){ [ 903.495439] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e5f0e8-0928-d694-d94f-5a832c5102d6" [ 903.495439] env[67820]: _type = "Task" [ 903.495439] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 903.502715] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e5f0e8-0928-d694-d94f-5a832c5102d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 903.551424] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 903.551637] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 903.551822] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Deleting the datastore file [datastore1] be35d888-f649-44e4-af23-341b8bfc81f6 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 903.552238] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-984a5397-e2b2-4452-8efa-962cf36d791f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.559117] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for the task: (returnval){ [ 903.559117] env[67820]: value = "task-3467342" [ 903.559117] env[67820]: _type = "Task" [ 903.559117] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 903.566873] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Task: {'id': task-3467342, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 904.005155] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 904.005421] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Creating directory with path [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 904.005647] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e592c49f-3b05-46dc-b5fb-fb58a5c46494 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.017185] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Created directory with path [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 904.017379] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Fetch image to [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 904.017547] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 904.018289] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56d0f864-4935-4440-b536-eb8f9cc0fff7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.024435] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bdb17e2-a0c2-44f1-a866-518063b1d8e9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.033129] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-927bae3b-0ab8-4252-860f-d04767094dbb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.066254] env[67820]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3961a6a3-e2e3-4cac-93aa-3594f05ec5c7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.072729] env[67820]: DEBUG oslo_vmware.api [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Task: {'id': task-3467342, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076804} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 904.074083] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 904.074270] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 904.074435] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 904.074604] env[67820]: INFO nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Took 0.59 seconds to destroy the instance on the hypervisor. 
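The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above both follow oslo.vmware's poll-until-done pattern: submit a vCenter task, repeatedly re-read its state ("progress is 0%"), return on success, and translate a failed task into the exception seen in the earlier traceback (raise exceptions.translate_fault(task_info.error), surfacing as VimFaultException: A specified parameter was not correct: fileType). A minimal, stdlib-only sketch of that loop follows; VimTask, TaskFailed, and fetch_state are hypothetical stand-ins for the real vim task objects and PropertyCollector reads in oslo_vmware.api.

import time
from dataclasses import dataclass

@dataclass
class VimTask:
    # hypothetical stand-in for a vim Task managed object reference
    id: str
    name: str
    state: str = "running"      # "running" | "success" | "error"
    progress: int = 0
    error: str | None = None

class TaskFailed(Exception):
    """Raised when the polled task ends in an error state."""

def wait_for_task(task: VimTask, fetch_state, poll_interval: float = 0.5) -> VimTask:
    # fetch_state(task) stands in for re-reading the task via the
    # PropertyCollector; here it is any callable that refreshes `task`.
    while True:
        fetch_state(task)
        print(f"Task: {{'id': {task.id!r}, 'name': {task.name!r}}} "
              f"progress is {task.progress}%.")
        if task.state == "success":
            return task
        if task.state == "error":
            # mirrors exceptions.translate_fault(task_info.error) in the trace
            raise TaskFailed(task.error)
        time.sleep(poll_interval)

def _fail_on_second_poll(task: VimTask, _calls=[0]):
    # tiny simulator: the second poll reports the fault from the log
    _calls[0] += 1
    if _calls[0] == 2:
        task.state = "error"
        task.error = "A specified parameter was not correct: fileType"

# wait_for_task(VimTask("task-3467340", "CopyVirtualDisk_Task"), _fail_on_second_poll)
# prints one progress line, then raises TaskFailed, which is the point in the
# flow where the compute manager above aborts the claim and re-schedules.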
[ 904.076284] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1c6be993-3d3d-489d-8711-37e0ed130fb3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.078059] env[67820]: DEBUG nova.compute.claims [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 904.078232] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 904.078444] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 904.110788] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 904.236431] env[67820]: DEBUG oslo_vmware.rw_handles [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 904.296354] env[67820]: DEBUG oslo_vmware.rw_handles [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 904.296354] env[67820]: DEBUG oslo_vmware.rw_handles [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 904.462235] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12e1b243-d808-4309-9f6e-dd8f27196959 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.470087] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc320d57-668c-4a2c-ba35-20af0da17e06 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.499178] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d83e4cb6-1188-46cd-b6a8-20b30a217723 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.506163] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-926bd541-3438-46b7-893d-b08526d0494c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.519089] env[67820]: DEBUG nova.compute.provider_tree [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 904.528030] env[67820]: DEBUG nova.scheduler.client.report [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 904.541406] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.463s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 904.541987] env[67820]: ERROR nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 904.541987] env[67820]: Faults: ['InvalidArgument'] [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Traceback (most recent call last): [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: 
be35d888-f649-44e4-af23-341b8bfc81f6] self.driver.spawn(context, instance, image_meta, [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self._fetch_image_if_missing(context, vi) [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] image_cache(vi, tmp_image_ds_loc) [ 904.541987] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] vm_util.copy_virtual_disk( [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] session._wait_for_task(vmdk_copy_task) [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return self.wait_for_task(task_ref) [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return evt.wait() [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] result = hub.switch() [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] return self.greenlet.switch() [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 904.542414] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] self.f(*self.args, **self.kw) [ 904.542844] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 904.542844] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] raise exceptions.translate_fault(task_info.error) [ 904.542844] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 904.542844] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Faults: ['InvalidArgument'] [ 904.542844] env[67820]: ERROR nova.compute.manager [instance: be35d888-f649-44e4-af23-341b8bfc81f6] [ 904.542844] env[67820]: DEBUG nova.compute.utils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 904.544069] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Build of instance be35d888-f649-44e4-af23-341b8bfc81f6 was re-scheduled: A specified parameter was not correct: fileType [ 904.544069] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 904.544436] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 904.544606] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 904.544760] env[67820]: DEBUG nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 904.544920] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 904.919958] env[67820]: DEBUG nova.network.neutron [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 904.932970] env[67820]: INFO nova.compute.manager [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Took 0.39 seconds to deallocate network for instance. [ 905.036106] env[67820]: INFO nova.scheduler.client.report [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Deleted allocations for instance be35d888-f649-44e4-af23-341b8bfc81f6 [ 905.059401] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ad6a746e-09c9-4e06-95c9-95d0045eca21 tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 338.218s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 905.061280] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 137.769s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.061280] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Acquiring lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 905.061495] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.061550] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 905.063440] env[67820]: INFO nova.compute.manager [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Terminating instance [ 905.067992] env[67820]: DEBUG nova.compute.manager [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 905.067992] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 905.068451] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d836a572-5c36-46cd-aa0b-daf86aa08c45 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.078018] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0db3c479-ccb2-4489-bc07-d25fee838ac4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.089553] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 905.110440] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance be35d888-f649-44e4-af23-341b8bfc81f6 could not be found. [ 905.110702] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 905.110982] env[67820]: INFO nova.compute.manager [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Took 0.04 seconds to destroy the instance on the hypervisor.
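The WARNING followed immediately by "Instance destroyed" illustrates how the destroy path tolerates a VM that is already gone from the backend. A small sketch of that assumed shape (dict-backed stand-in, not the real vmops code):

    class InstanceNotFound(Exception):
        pass

    def lookup_vm_ref(backend, uuid):
        try:
            return backend[uuid]
        except KeyError:
            raise InstanceNotFound(f"Instance {uuid} could not be found.") from None

    def destroy(backend, uuid):
        try:
            lookup_vm_ref(backend, uuid)
        except InstanceNotFound as exc:
            print("WARNING: Instance does not exist on backend:", exc)
            return  # treated as already destroyed; teardown continues
        backend.pop(uuid)  # unregister/delete the located VM

    destroy({}, "be35d888-f649-44e4-af23-341b8bfc81f6")  # prints the warning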
[ 905.111251] env[67820]: DEBUG oslo.service.loopingcall [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 905.111474] env[67820]: DEBUG nova.compute.manager [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 905.111726] env[67820]: DEBUG nova.network.neutron [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 905.146855] env[67820]: DEBUG nova.network.neutron [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 905.154929] env[67820]: INFO nova.compute.manager [-] [instance: be35d888-f649-44e4-af23-341b8bfc81f6] Took 0.04 seconds to deallocate network for instance. [ 905.171589] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 905.172189] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 905.173356] env[67820]: INFO nova.compute.claims [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 905.257926] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d469afb2-9b64-4b4d-973d-239c5656dc1e tempest-TenantUsagesTestJSON-426545680 tempest-TenantUsagesTestJSON-426545680-project-member] Lock "be35d888-f649-44e4-af23-341b8bfc81f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.197s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 905.519342] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9132e8ca-1572-4d9d-8ea7-e8b54d395115 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.527038] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e498a0a0-8fb1-41c8-98e6-61234575b194 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.558087] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-92a039bc-596d-47dc-ae9a-bb59f6cb6877 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.565038] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce70320-b4f2-46f7-8f1c-8e76016f7165 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.577831] env[67820]: DEBUG nova.compute.provider_tree [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 905.586421] env[67820]: DEBUG nova.scheduler.client.report [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 905.600685] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.429s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 905.601157] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 905.633489] env[67820]: DEBUG nova.compute.utils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 905.634637] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Allocating IP information in the background. 
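"Allocating IP information in the background" refers to Nova starting the Neutron allocation concurrently with the rest of the build and joining the result later. Nova does this with an eventlet greenthread; the sketch below uses a thread-pool future purely as a stand-in, with a placeholder allocation function:

    from concurrent.futures import ThreadPoolExecutor

    def allocate_for_instance(instance_uuid):
        # placeholder for the Neutron calls that create and bind the port
        return [{"port_id": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31"}]

    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(allocate_for_instance,
                             "d06b6984-d1d4-4afd-8ffd-f37407697d4b")
        # ... block device mappings are built here while ports are created ...
        network_info = future.result()  # joined before the VM is spawned
    print(network_info)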
{{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 905.634822] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 905.643893] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 905.694667] env[67820]: DEBUG nova.policy [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 905.730402] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Start spawning the instance on the hypervisor. 
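The nova.virt.hardware entries that follow derive a guest CPU topology from flavor and image constraints; with no limits set, a 1-vCPU flavor yields the single topology 1:1:1. A compressed sketch of that enumeration (illustrative; the real logic lives in nova/virt/hardware.py):

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, maximum):
        """Enumerate factorizations of the vCPU count within the limits."""
        for sockets in range(1, vcpus + 1):
            for cores in range(1, vcpus + 1):
                for threads in range(1, vcpus + 1):
                    if (sockets * cores * threads == vcpus
                            and sockets <= maximum.sockets
                            and cores <= maximum.cores
                            and threads <= maximum.threads):
                        yield VirtCPUTopology(sockets, cores, threads)

    maximum = VirtCPUTopology(65536, 65536, 65536)
    print(list(possible_topologies(1, maximum)))  # -> [VirtCPUTopology(1, 1, 1)]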
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 905.760253] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 905.760600] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 905.760770] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 905.760953] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 905.761140] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 905.761292] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 905.761499] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 905.761715] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 905.761842] env[67820]: DEBUG nova.virt.hardware [None
req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 905.762034] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 905.762149] env[67820]: DEBUG nova.virt.hardware [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 905.763049] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c53a31fb-d2e3-454e-b519-848737fe1e42 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 905.771427] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2846417b-2661-4bbf-87da-392b97f8f6fb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 906.105293] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Successfully created port: e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 906.948286] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Successfully updated port: e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 906.961649] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 906.961995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 906.961995] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 907.024159] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 
tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 907.203207] env[67820]: DEBUG nova.compute.manager [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Received event network-vif-plugged-e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 907.203207] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Acquiring lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 907.203207] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 907.203207] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 907.203605] env[67820]: DEBUG nova.compute.manager [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] No waiting events found dispatching network-vif-plugged-e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 907.203605] env[67820]: WARNING nova.compute.manager [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Received unexpected event network-vif-plugged-e830bd17-5fdb-446d-9fb2-aaf1e0067b31 for instance with vm_state building and task_state spawning. [ 907.203685] env[67820]: DEBUG nova.compute.manager [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Received event network-changed-e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 907.203782] env[67820]: DEBUG nova.compute.manager [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Refreshing instance network info cache due to event network-changed-e830bd17-5fdb-446d-9fb2-aaf1e0067b31.
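The "No waiting events found" / "Received unexpected event" pair above comes from the external-event dispatch table: an arriving Neutron event either completes a registered waiter or is logged as unexpected when nothing is waiting for it (here the build had not yet registered for network-vif-plugged). A minimal sketch of that idea, not the manager's actual code:

    import threading

    waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(instance_uuid, name):
        waiters[(instance_uuid, name)] = threading.Event()

    def pop_instance_event(instance_uuid, name):
        waiter = waiters.pop((instance_uuid, name), None)
        if waiter is None:
            print(f"WARNING: Received unexpected event {name}")
        else:
            waiter.set()  # unblocks whoever is waiting on the event

    uuid = "d06b6984-d1d4-4afd-8ffd-f37407697d4b"
    event = "network-vif-plugged-e830bd17-5fdb-446d-9fb2-aaf1e0067b31"
    pop_instance_event(uuid, event)   # nothing registered: logged as unexpected
    prepare_for_event(uuid, event)
    pop_instance_event(uuid, event)   # completes the waiter silently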
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 907.203942] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Acquiring lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 907.502206] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Updating instance_info_cache with network_info: [{"id": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "address": "fa:16:3e:1e:58:42", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape830bd17-5f", "ovs_interfaceid": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.524650] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 907.524972] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance network_info: |[{"id": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "address": "fa:16:3e:1e:58:42", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tape830bd17-5f", "ovs_interfaceid": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 907.525288] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Acquired lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 907.525470] env[67820]: DEBUG nova.network.neutron [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Refreshing network info cache for port e830bd17-5fdb-446d-9fb2-aaf1e0067b31 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 907.526549] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1e:58:42', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e830bd17-5fdb-446d-9fb2-aaf1e0067b31', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 907.534925] env[67820]: DEBUG oslo.service.loopingcall [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 907.537867] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 907.538729] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b76e44b1-3fbe-42d4-82f2-ce998e70a6ed {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 907.562961] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 907.562961] env[67820]: value = "task-3467343" [ 907.562961] env[67820]: _type = "Task" [ 907.562961] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 907.571875] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467343, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 907.927510] env[67820]: DEBUG nova.network.neutron [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Updated VIF entry in instance network info cache for port e830bd17-5fdb-446d-9fb2-aaf1e0067b31. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 907.927929] env[67820]: DEBUG nova.network.neutron [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Updating instance_info_cache with network_info: [{"id": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "address": "fa:16:3e:1e:58:42", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape830bd17-5f", "ovs_interfaceid": "e830bd17-5fdb-446d-9fb2-aaf1e0067b31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.939063] env[67820]: DEBUG oslo_concurrency.lockutils [req-71aa1b20-16e6-46cb-827d-30bcb61afd98 req-741a7616-51b6-4bf2-8254-ca729b641618 service nova] Releasing lock "refresh_cache-d06b6984-d1d4-4afd-8ffd-f37407697d4b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 908.074802] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467343, 'name': CreateVM_Task, 'duration_secs': 0.298365} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 908.074802] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 908.075358] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 908.075540] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 908.075867] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 908.076222] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8a799477-609c-41e4-820d-06dd8c52ffc3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 908.085902] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 908.085902] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52391887-aa86-22fb-aec7-8fbf81b28020" [ 908.085902] env[67820]: _type = "Task" [ 908.085902] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 908.095206] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52391887-aa86-22fb-aec7-8fbf81b28020, 'name': SearchDatastore_Task} progress is 0%. 
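The "[datastore1] devstack-image-cache_base/<image id>" lock taken above serializes the image-cache check so concurrent builds do not fetch the same VMDK twice. A sketch of that check-then-fetch pattern using oslo.concurrency's named locks (the cache dict and fetch callable are stand-ins for the datastore search and download):

    from oslo_concurrency import lockutils

    def fetch_image_if_missing(datastore_cache, image_id, fetch):
        lock_name = f"[datastore1] devstack-image-cache_base/{image_id}"
        with lockutils.lock(lock_name):
            if image_id not in datastore_cache:      # SearchDatastore_Task analogue
                datastore_cache[image_id] = fetch()  # download and cache the VMDK
        return datastore_cache[image_id]

    print(fetch_image_if_missing({}, "4407539e-b292-42b4-91c4-4faa60d48bab",
                                 lambda: "tmp-sparse.vmdk copied into cache"))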
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 908.596427] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 908.600074] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 908.600074] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 910.219934] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 911.087649] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 911.087880] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.622862] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 939.622249] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.622051] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.622318] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.622561] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.622631] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 942.622627] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 942.622910] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 942.622941] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 942.645332] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.645518] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.645689] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.645828] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.645954] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646093] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. 
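The periodic entries here are produced by methods registered with oslo.service's periodic-task machinery and run on their configured spacing. A tiny sketch of how such a task is declared and driven (requires oslo.service; the spacing value is made up):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _heal_instance_info_cache(self, context):
            # instances still in the Building state are skipped, as logged above
            print("Starting heal instance info cache")

    Manager(cfg.CONF).run_periodic_tasks(context=None)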
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646256] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646394] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646512] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646629] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 942.646750] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 943.620867] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 943.621141] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 943.633629] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 943.633883] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 943.634020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 943.634181] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 943.635307] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac4d5a0-30b8-4847-b50b-ddcfe529a0db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.644289] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2aa9e72-a68f-4b96-bd3c-81840debbfb7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.658585] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da8005f0-a3d6-4573-ba39-8f64a2c17a72 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.665122] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7511d1-c3c6-43ea-a573-322fdce5e917 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 943.693966] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180896MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 943.694142] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 943.694340] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 943.770204] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1cc3b207-a628-4fe5-8908-6879483806b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.770399] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 80f480dc-9bb8-4764-9b6b-793c0954962e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.770536] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
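The resource audit arithmetic is visible in the entries above and below: ten tracked instances, each claiming 1 VCPU / 128 MB / 1 GB, plus 512 MB of reserved host RAM, add up to exactly the "Final resource view" figures reported further down. A quick check of that accounting:

    # Each of the ten instances listed above claims the same resources.
    instances = [{"vcpus": 1, "memory_mb": 128, "root_gb": 1}] * 10

    used_vcpus = sum(i["vcpus"] for i in instances)
    used_ram_mb = 512 + sum(i["memory_mb"] for i in instances)  # 512 MB reserved
    used_disk_gb = sum(i["root_gb"] for i in instances)

    # Matches the logged view: used_ram=1792MB used_disk=10GB used_vcpus=10
    print(used_ram_mb, used_disk_gb, used_vcpus)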
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.770659] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.770780] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.770897] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.771015] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.771139] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.771271] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.771414] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 943.782632] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.792926] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49515d53-7359-4bc6-8a26-95ffd4fe4ed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.802967] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.813802] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 22905445-5120-4cdd-b965-099001e4147c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.823902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.833695] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 53757aa2-7013-42a5-94bd-e831c8f08c40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.843886] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 76ca6882-c1e0-4ae0-af8d-7d5673a13540 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.853361] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f77a97a9-3c8c-4484-8948-dd6dc9dc4077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.863920] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 128f3466-5304-44dc-a569-2ba894f5333c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.873092] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 25a4642e-155a-473f-953c-b0fedbb6eac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.882312] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8933061f-5489-447c-80ca-28d4a427349e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.891906] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d46bcc6e-6c3a-4200-a7f3-4c571ba6d819 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.900814] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d962be83-2769-465b-8628-1c1b656b6830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.909908] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.919134] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
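The "Skipping heal of allocation" entries reflect a three-way decision in the allocation-healing pass: keep allocations for instances tracked on this host, leave alone allocations whose instance is scheduled here but not yet started, and only reclaim the rest. A compact sketch of that decision (illustrative, not the tracker's actual code):

    def heal_action(instance_uuid, tracked_uuids, scheduled_uuids):
        if instance_uuid in tracked_uuids:
            return "keep: actively managed on this compute host"
        if instance_uuid in scheduled_uuids:
            return "skip heal: scheduled here but instance has yet to start"
        return "reclaim: allocation no longer backed by an instance"

    print(heal_action("31ec9cab-abfb-4a73-8df8-057670201267",
                      tracked_uuids=set(),
                      scheduled_uuids={"31ec9cab-abfb-4a73-8df8-057670201267"}))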
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 943.919365] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 943.919511] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 944.209656] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b169e235-d70d-4f9a-bb0c-bc547d1ceaf1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 944.217318] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14d57843-9f47-42bf-b38a-8fcddfbf2611 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 944.247629] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9eb6dd1-336c-4d33-94c8-71b63d5d7b88 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 944.254735] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15f62a58-1c6d-4223-8286-fabe2ce4cdbc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 944.267806] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 944.277125] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 944.292246] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 944.292432] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 945.293572] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 949.749048] env[67820]: WARNING oslo_vmware.rw_handles [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 949.749048] env[67820]: ERROR oslo_vmware.rw_handles [ 949.749048] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 949.750739] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 949.751021] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Copying Virtual Disk [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/3722c7ba-cede-463b-97c7-189580d8abe1/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 949.751310] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c0d82b9d-6260-4422-965b-d4993364defe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
949.761914] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for the task: (returnval){ [ 949.761914] env[67820]: value = "task-3467344" [ 949.761914] env[67820]: _type = "Task" [ 949.761914] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 949.769591] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Task: {'id': task-3467344, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 950.275214] env[67820]: DEBUG oslo_vmware.exceptions [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 950.275624] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 950.276189] env[67820]: ERROR nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 950.276189] env[67820]: Faults: ['InvalidArgument'] [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Traceback (most recent call last): [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] yield resources [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self.driver.spawn(context, instance, image_meta, [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] 
self._fetch_image_if_missing(context, vi) [ 950.276189] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] image_cache(vi, tmp_image_ds_loc) [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] vm_util.copy_virtual_disk( [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] session._wait_for_task(vmdk_copy_task) [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return self.wait_for_task(task_ref) [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return evt.wait() [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] result = hub.switch() [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 950.276558] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return self.greenlet.switch() [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self.f(*self.args, **self.kw) [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] raise exceptions.translate_fault(task_info.error) [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Faults: ['InvalidArgument'] [ 950.276901] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] [ 950.276901] env[67820]: INFO nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 
tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Terminating instance [ 950.278326] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 950.278401] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 950.278587] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7a1de50a-8bd1-432c-b4b2-7c4cb72c401d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.280870] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 950.281135] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 950.282118] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-912afdf5-c19d-4365-8a34-bca5a4a5fb00 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.289452] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 950.289742] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6cc1525f-3d9b-46a9-aaa9-81de94be8430 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.291983] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 950.292172] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 950.293282] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9b07ae4a-942d-453c-8a70-481ff29dd671 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.298243] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for the task: (returnval){ [ 950.298243] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5231f61e-2b37-e602-c030-83ebe0db3224" [ 950.298243] env[67820]: _type = "Task" [ 950.298243] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 950.305385] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5231f61e-2b37-e602-c030-83ebe0db3224, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 950.357228] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 950.357505] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 950.357720] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Deleting the datastore file [datastore1] 1cc3b207-a628-4fe5-8908-6879483806b9 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 950.358021] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5d1c5bf1-2ffb-4e41-8456-623e50f0e69f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.364234] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for the task: (returnval){ [ 950.364234] env[67820]: value = "task-3467346" [ 950.364234] env[67820]: _type = "Task" [ 950.364234] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 950.372255] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Task: {'id': task-3467346, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 950.808829] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 950.809109] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Creating directory with path [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 950.809350] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e67f304-b7a9-49d5-87fa-9607398b8177 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.821572] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Created directory with path [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 950.821572] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Fetch image to [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 950.821744] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 950.822479] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1736d18d-6144-4b3a-a144-e11158d4db3c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.829223] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b655bcdd-3478-4fdb-bb9d-b6e6a7ce61c9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.838437] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2820366-7793-4249-b564-69b464adeddc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.872527] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e2d7b22e-6d79-4f9a-bba9-b2e52fb37baf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.881143] env[67820]: DEBUG oslo_vmware.api [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Task: {'id': task-3467346, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068286} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 950.882640] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 950.882833] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 950.883012] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 950.883197] env[67820]: INFO nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Took 0.60 seconds to destroy the instance on the hypervisor. 
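
The unregister/delete sequence above follows the same pattern as every vCenter operation in this log: invoke a method that returns a Task managed object, then poll the task's `info` state until it reaches success or error (the "progress is 0%." and "completed successfully" lines). A minimal sketch of that polling loop, not the real oslo.vmware implementation; `session.get_task_info()` is a hypothetical helper standing in for the PropertyCollector read of the task's `info` property:

```python
import time

class TaskFailed(Exception):
    """Raised when vCenter reports the task in the error state."""

def wait_for_task(session, task_ref, interval=0.5):
    # Poll until the task leaves the queued/running states, mirroring
    # the oslo_vmware.api _poll_task loop referenced in the log.
    while True:
        info = session.get_task_info(task_ref)  # hypothetical helper
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # oslo.vmware translates the fault here, e.g. into the
            # VimFaultException ("...not correct: fileType") seen above.
            raise TaskFailed(info.error)
        time.sleep(interval)  # between polls the log prints progress lines
```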
[ 950.884962] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c9dd89d1-39d4-47cb-aa85-fd0c0c4da80a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.887886] env[67820]: DEBUG nova.compute.claims [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 950.887886] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 950.887886] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 950.908152] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 950.966370] env[67820]: DEBUG oslo_vmware.rw_handles [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 951.027460] env[67820]: DEBUG oslo_vmware.rw_handles [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 951.027693] env[67820]: DEBUG oslo_vmware.rw_handles [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 951.240802] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-743bdfff-33ba-45a3-a189-a86ee9ea9959 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.248449] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf822be-e88d-4e06-ad47-83aed2fa4902 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.277400] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9e5020a-0ead-4a07-9f57-f634644b47b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.284520] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813365f3-13df-4b7a-b4e5-91b13e181fb3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.297149] env[67820]: DEBUG nova.compute.provider_tree [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 951.305552] env[67820]: DEBUG nova.scheduler.client.report [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 951.320283] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.433s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.320854] env[67820]: ERROR nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 951.320854] env[67820]: Faults: ['InvalidArgument'] [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Traceback (most recent call last): [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/compute/manager.py", line 
2615, in _build_and_run_instance [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self.driver.spawn(context, instance, image_meta, [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self._fetch_image_if_missing(context, vi) [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] image_cache(vi, tmp_image_ds_loc) [ 951.320854] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] vm_util.copy_virtual_disk( [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] session._wait_for_task(vmdk_copy_task) [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return self.wait_for_task(task_ref) [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return evt.wait() [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] result = hub.switch() [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] return self.greenlet.switch() [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 951.321297] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] self.f(*self.args, **self.kw) [ 951.321797] env[67820]: ERROR nova.compute.manager [instance: 
1cc3b207-a628-4fe5-8908-6879483806b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 951.321797] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] raise exceptions.translate_fault(task_info.error) [ 951.321797] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 951.321797] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Faults: ['InvalidArgument'] [ 951.321797] env[67820]: ERROR nova.compute.manager [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] [ 951.321797] env[67820]: DEBUG nova.compute.utils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 951.325679] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Build of instance 1cc3b207-a628-4fe5-8908-6879483806b9 was re-scheduled: A specified parameter was not correct: fileType [ 951.325679] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 951.326074] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 951.326246] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 951.326416] env[67820]: DEBUG nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 951.326579] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 951.743178] env[67820]: DEBUG nova.network.neutron [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.756491] env[67820]: INFO nova.compute.manager [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Took 0.43 seconds to deallocate network for instance. [ 951.859505] env[67820]: INFO nova.scheduler.client.report [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Deleted allocations for instance 1cc3b207-a628-4fe5-8908-6879483806b9 [ 951.879090] env[67820]: DEBUG oslo_concurrency.lockutils [None req-69d59b03-2db8-498b-a897-9fda35426ae5 tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 378.375s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.880682] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 178.077s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.880910] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Acquiring lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.881132] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 
tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.881300] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.883696] env[67820]: INFO nova.compute.manager [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Terminating instance [ 951.885271] env[67820]: DEBUG nova.compute.manager [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 951.885630] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 951.885937] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b2188c0b-b479-4e7b-af81-fb3ff45f4adf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.895691] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c136fd-ca53-446d-bfd8-308a7b95c41a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.907236] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 951.928294] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1cc3b207-a628-4fe5-8908-6879483806b9 could not be found. 
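
The WARNING just above is benign: the failed spawn already removed the VM, so the delayed terminate finds nothing on the backend and the driver treats that as success, keeping destroy idempotent (the log proceeds straight to "Instance destroyed"). A rough sketch of the pattern under that assumption; `backend_delete` and this local `InstanceNotFound` are hypothetical stand-ins, not nova's actual code:

```python
import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy(instance_uuid, backend_delete):
    # backend_delete: hypothetical callable that removes the VM from the
    # hypervisor and raises InstanceNotFound if it is already gone.
    try:
        backend_delete(instance_uuid)
    except InstanceNotFound:
        # Already cleaned up earlier; log and continue so teardown of
        # networks, allocations, and locks still runs.
        LOG.warning("Instance %s does not exist on backend", instance_uuid)
```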
[ 951.928494] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 951.928702] env[67820]: INFO nova.compute.manager [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 951.928948] env[67820]: DEBUG oslo.service.loopingcall [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 951.929188] env[67820]: DEBUG nova.compute.manager [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 951.929286] env[67820]: DEBUG nova.network.neutron [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 951.956566] env[67820]: DEBUG nova.network.neutron [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.961759] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.961815] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.963928] env[67820]: INFO nova.compute.claims [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 951.967206] env[67820]: INFO nova.compute.manager [-] [instance: 1cc3b207-a628-4fe5-8908-6879483806b9] Took 0.04 seconds to deallocate network for instance. 
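
The instance_claim/abort_instance_claim pairs in the lockutils lines all serialize on the same "compute_resources" lock, which is why the log interleaves matching acquired/released pairs with 0.000s waits. A toy model of that claim protocol, using threading.Lock as a stand-in for nova's internal synchronization; the numbers below simply match the resource view reported earlier in this log:

```python
import threading

class ResourceTracker:
    """Toy model of the claim protocol serialized by "compute_resources"."""

    def __init__(self, vcpus, memory_mb):
        self._lock = threading.Lock()  # stands in for the "compute_resources" lock
        self.free = {'VCPU': vcpus, 'MEMORY_MB': memory_mb}

    def instance_claim(self, request):
        with self._lock:  # logged as 'acquired by ... instance_claim'
            if any(self.free[r] < amount for r, amount in request.items()):
                raise RuntimeError('claim failed')
            for r, amount in request.items():
                self.free[r] -= amount

    def abort_instance_claim(self, request):
        with self._lock:  # logged as 'acquired by ... abort_instance_claim'
            for r, amount in request.items():
                self.free[r] += amount  # return resources after a failed spawn

tracker = ResourceTracker(vcpus=48, memory_mb=196590)
tracker.instance_claim({'VCPU': 1, 'MEMORY_MB': 128})  # an m1.nano-sized claim
```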
[ 952.065614] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c7680b1e-f2ef-46a1-9a84-6b5a7379459f tempest-ImagesOneServerNegativeTestJSON-1536449618 tempest-ImagesOneServerNegativeTestJSON-1536449618-project-member] Lock "1cc3b207-a628-4fe5-8908-6879483806b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.313110] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3499940-8580-4b40-90c0-adb77e096f32 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.320372] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11796615-e300-48fc-82d1-37531b19a215 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.352319] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e333fce1-81f6-4550-aba7-7e1a692646f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.359246] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f15025-1a0c-4407-ac0a-bd245619b16e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.371969] env[67820]: DEBUG nova.compute.provider_tree [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 952.381368] env[67820]: DEBUG nova.scheduler.client.report [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 952.395324] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.433s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.395792] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 952.431250] env[67820]: DEBUG nova.compute.utils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 952.432601] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 952.432813] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 952.442923] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 952.503839] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 952.513693] env[67820]: DEBUG nova.policy [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aac589c6933248f4931af9ebf3dbbde9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c6acf4fc89fa4b6391c4029070ea2773', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 952.529055] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), 
allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 952.529338] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 952.529518] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 952.529621] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 952.529767] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 952.529910] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 952.530137] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 952.530293] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 952.530488] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 952.530660] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 952.530833] env[67820]: DEBUG nova.virt.hardware [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 952.531981] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e099328a-e9d2-4ba9-8877-c4dd94a7c456 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.539646] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a2fb340-c775-4e17-a5a8-2fe47ea6b83a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.978287] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully created port: 8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 953.626526] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully created port: 54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 954.089820] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully created port: 06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 954.931171] env[67820]: DEBUG nova.compute.manager [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-vif-plugged-8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 954.931171] env[67820]: DEBUG oslo_concurrency.lockutils [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.931171] env[67820]: DEBUG oslo_concurrency.lockutils [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 954.931171] env[67820]: DEBUG oslo_concurrency.lockutils [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 954.931285] env[67820]: DEBUG nova.compute.manager [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] No waiting events found dispatching network-vif-plugged-8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 
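The Acquiring/acquired/released triplets around "pop_instance_event.._pop_event" above are oslo.concurrency's lockutils at work: every access to the per-instance event registry is serialized on a "<instance-uuid>-events" lock. A minimal sketch of the same pattern (the lock name follows the log; the registry and function are illustrative, not Nova's actual code):

    from oslo_concurrency import lockutils

    # Hypothetical in-memory registry of pending events per instance.
    _events = {}

    def pop_instance_event(instance_uuid, event_name):
        # Same locking discipline as the log's _pop_event: take the
        # "<uuid>-events" lock, then pop the waiter if one was registered.
        with lockutils.lock('%s-events' % instance_uuid):
            return _events.get(instance_uuid, {}).pop(event_name, None)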
954.931285] env[67820]: WARNING nova.compute.manager [req-02ac7049-6214-4adb-9e79-70b9380297b2 req-d96730f8-7898-4bf8-9330-b8c4dbc338c0 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received unexpected event network-vif-plugged-8a78fb04-e2b9-48d2-ae15-17b5db6799e4 for instance with vm_state building and task_state spawning. [ 955.060759] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully updated port: 8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 956.235911] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully updated port: 54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 956.976385] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-changed-8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 956.976385] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing instance network info cache due to event network-changed-8a78fb04-e2b9-48d2-ae15-17b5db6799e4. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 956.976505] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Acquiring lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 956.976633] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Acquired lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 956.977795] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing network info cache for port 8a78fb04-e2b9-48d2-ae15-17b5db6799e4 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 957.029940] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance cache missing network info. 
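The WARNING just above ("Received unexpected event network-vif-plugged-... for instance with vm_state building and task_state spawning") is benign here: Neutron bound the port before the compute manager registered a waiter for the vif-plugged event, so the dispatch found nobody to notify. A toy reconstruction of the prepare/dispatch handshake (threading.Event stands in for Nova's eventlet-based waiters; names are illustrative):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            # The driver calls this *before* plugging VIFs, then blocks on wait().
            evt = threading.Event()
            self._waiters[(instance_uuid, event_name)] = evt
            return evt

        def dispatch(self, instance_uuid, event_name):
            # Called when Neutron reports the plug via external_instance_event.
            evt = self._waiters.pop((instance_uuid, event_name), None)
            if evt is None:
                return False  # -> "Received unexpected event ..." in the log
            evt.set()
            return True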
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 957.216210] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Successfully updated port: 06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 957.229719] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 957.333524] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 957.345046] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Releasing lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 957.345751] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-vif-plugged-54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 957.345751] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 957.345751] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 957.345910] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 957.346104] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] No waiting events found dispatching network-vif-plugged-54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 957.346169] env[67820]: WARNING nova.compute.manager 
[req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received unexpected event network-vif-plugged-54c4a29d-3779-4195-9d6d-69e7a8af9654 for instance with vm_state building and task_state spawning. [ 957.346488] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-changed-54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 957.346488] env[67820]: DEBUG nova.compute.manager [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing instance network info cache due to event network-changed-54c4a29d-3779-4195-9d6d-69e7a8af9654. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 957.346612] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Acquiring lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 957.346746] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Acquired lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 957.346892] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing network info cache for port 54c4a29d-3779-4195-9d6d-69e7a8af9654 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 957.396512] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 957.658988] env[67820]: DEBUG nova.network.neutron [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 957.671714] env[67820]: DEBUG oslo_concurrency.lockutils [req-832c8fad-7272-4e4f-8249-773855d31f4d req-23e047e4-6b13-4808-b9b7-fae05f5fcdb7 service nova] Releasing lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 957.671714] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 957.671714] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 957.752986] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance cache missing network info. 
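Each network-changed event triggers the same refresh cycle seen above: take the "refresh_cache-<uuid>" lock, rebuild the view from Neutron, and write it back to instance_info_cache (an empty list while the ports are still binding). A condensed sketch, with the Neutron query and the cache write left as hypothetical stand-in callables:

    from oslo_concurrency import lockutils

    def refresh_nw_cache(instance_uuid, list_ports, save_cache):
        # list_ports/save_cache stand in for the Neutron port query and the
        # instance_info_cache update seen in the log.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            ports = list_ports(device_id=instance_uuid)
            nw_info = [{'id': p['id'], 'address': p['mac_address']}
                       for p in ports]          # [] until ports are bound
            save_cache(instance_uuid, nw_info)
            return nw_info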
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 958.866241] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [{"id": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "address": "fa:16:3e:26:a3:cb", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": "nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a78fb04-e2", "ovs_interfaceid": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "address": "fa:16:3e:55:bb:f5", "network": {"id": "857bbc50-7975-41f5-95ab-f45a8b004187", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1676850120", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54c4a29d-37", "ovs_interfaceid": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "address": "fa:16:3e:9b:5f:65", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": 
"nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06ebaf73-64", "ovs_interfaceid": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 958.883060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 958.883467] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance network_info: |[{"id": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "address": "fa:16:3e:26:a3:cb", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": "nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a78fb04-e2", "ovs_interfaceid": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "address": "fa:16:3e:55:bb:f5", "network": {"id": "857bbc50-7975-41f5-95ab-f45a8b004187", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1676850120", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54c4a29d-37", "ovs_interfaceid": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "address": "fa:16:3e:9b:5f:65", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", 
"bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": "nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06ebaf73-64", "ovs_interfaceid": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 958.884178] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:26:a3:cb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0467beaa-08c6-44d6-b8a2-e9c609c21ff4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8a78fb04-e2b9-48d2-ae15-17b5db6799e4', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:55:bb:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '51876cd6-d373-4edc-8595-254e5d631378', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '54c4a29d-3779-4195-9d6d-69e7a8af9654', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:5f:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0467beaa-08c6-44d6-b8a2-e9c609c21ff4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '06ebaf73-6499-4b3f-b2b7-aaed764bbd75', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 958.895981] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating folder: Project (c6acf4fc89fa4b6391c4029070ea2773). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 958.896920] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-91161dba-9c93-4f29-b189-9160ba75637b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.907846] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created folder: Project (c6acf4fc89fa4b6391c4029070ea2773) in parent group-v692668. [ 958.908099] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating folder: Instances. Parent ref: group-v692716. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 958.908349] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5bdd7411-93d1-434f-9e12-c7a6274932af {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.918085] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created folder: Instances in parent group-v692716. [ 958.918327] env[67820]: DEBUG oslo.service.loopingcall [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 958.918517] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 958.918713] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-25ef1916-4e43-4d26-8430-05aa229a1695 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 958.943310] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 958.943310] env[67820]: value = "task-3467349" [ 958.943310] env[67820]: _type = "Task" [ 958.943310] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 958.950666] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467349, 'name': CreateVM_Task} progress is 0%. 
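CreateVM_Task returns immediately with a task moref ("task-3467349"); the "progress is 0%" lines come from oslo.vmware polling TaskInfo until it reaches a terminal state. A stripped-down version of that loop (the real _poll_task runs inside a looping call and translates errors into VimFaultException; get_task_info is a stand-in for the property-collector read):

    import time

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        while True:
            info = get_task_info(task_ref)   # reads TaskInfo for 'task-3467349'
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware raises exceptions.translate_fault(info.error) here.
                raise RuntimeError(info.error)
            time.sleep(interval)             # next "progress is N%" poll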
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.051326] env[67820]: DEBUG nova.compute.manager [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-vif-plugged-06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 959.051601] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 959.051795] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 959.051963] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 959.052541] env[67820]: DEBUG nova.compute.manager [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] No waiting events found dispatching network-vif-plugged-06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 959.052804] env[67820]: WARNING nova.compute.manager [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received unexpected event network-vif-plugged-06ebaf73-6499-4b3f-b2b7-aaed764bbd75 for instance with vm_state building and task_state spawning. [ 959.052985] env[67820]: DEBUG nova.compute.manager [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Received event network-changed-06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 959.053153] env[67820]: DEBUG nova.compute.manager [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing instance network info cache due to event network-changed-06ebaf73-6499-4b3f-b2b7-aaed764bbd75. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 959.053338] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Acquiring lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 959.053470] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Acquired lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 959.053633] env[67820]: DEBUG nova.network.neutron [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Refreshing network info cache for port 06ebaf73-6499-4b3f-b2b7-aaed764bbd75 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 959.424802] env[67820]: DEBUG nova.network.neutron [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updated VIF entry in instance network info cache for port 06ebaf73-6499-4b3f-b2b7-aaed764bbd75. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 959.425297] env[67820]: DEBUG nova.network.neutron [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [{"id": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "address": "fa:16:3e:26:a3:cb", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.50", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": "nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a78fb04-e2", "ovs_interfaceid": "8a78fb04-e2b9-48d2-ae15-17b5db6799e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "address": "fa:16:3e:55:bb:f5", "network": {"id": "857bbc50-7975-41f5-95ab-f45a8b004187", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1676850120", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": 
"default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "51876cd6-d373-4edc-8595-254e5d631378", "external-id": "nsx-vlan-transportzone-916", "segmentation_id": 916, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54c4a29d-37", "ovs_interfaceid": "54c4a29d-3779-4195-9d6d-69e7a8af9654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "address": "fa:16:3e:9b:5f:65", "network": {"id": "b77c10f1-cd4d-4f04-9b29-8ef0bb6568f1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-653284616", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0467beaa-08c6-44d6-b8a2-e9c609c21ff4", "external-id": "nsx-vlan-transportzone-540", "segmentation_id": 540, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06ebaf73-64", "ovs_interfaceid": "06ebaf73-6499-4b3f-b2b7-aaed764bbd75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 959.436661] env[67820]: DEBUG oslo_concurrency.lockutils [req-9d875f04-8745-4ba2-aa58-cddc2bd0f201 req-96c415e3-1ee9-4e0c-9803-b6ca73a88412 service nova] Releasing lock "refresh_cache-31ec9cab-abfb-4a73-8df8-057670201267" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 959.458071] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467349, 'name': CreateVM_Task, 'duration_secs': 0.378925} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 959.458297] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 959.460151] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 959.460418] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 959.461213] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 959.461771] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-999ab6ae-4d0b-4049-968a-4ac866eac0fc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 959.466757] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 959.466757] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5258bd78-f638-4b95-4a0e-87d048f9e88d" [ 959.466757] env[67820]: _type = "Task" [ 959.466757] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 959.479075] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5258bd78-f638-4b95-4a0e-87d048f9e88d, 'name': SearchDatastore_Task} progress is 0%. 
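The lock on "[datastore1] devstack-image-cache_base/<image-id>" plus the SearchDatastore_Task above implement a check-then-fetch image cache: probe the cache path under the lock and download from Glance only on a miss. In outline (exists/download/cache are hypothetical callables standing in for the datastore browser, the Glance transfer, and the disk copy):

    from oslo_concurrency import lockutils

    def fetch_image_if_missing(image_id, datastore, exists, download, cache):
        path = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
        with lockutils.lock(path):
            if exists(path):            # SearchDatastore_Task in the log
                return path
            tmp = download(image_id)    # stream Glance data to tmp-sparse.vmdk
            cache(tmp, path)            # CopyVirtualDisk into the cache path
            return path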
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 959.980015] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 959.980355] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 959.980658] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 961.135081] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 961.135525] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 976.935655] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.381763] env[67820]: DEBUG oslo_concurrency.lockutils [None req-08636822-4130-408a-b62d-184bdcbafa1d tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Acquiring lock "df369452-6ff5-4d06-98d3-edf0824a685b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.382098] env[67820]: DEBUG oslo_concurrency.lockutils [None req-08636822-4130-408a-b62d-184bdcbafa1d tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "df369452-6ff5-4d06-98d3-edf0824a685b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
:: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 978.961029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-2d8fd0b7-bed3-4f17-a73b-0e17a572f826 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Acquiring lock "0255953f-2dd2-48fa-9030-9aa96a61c504" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 978.961183] env[67820]: DEBUG oslo_concurrency.lockutils [None req-2d8fd0b7-bed3-4f17-a73b-0e17a572f826 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "0255953f-2dd2-48fa-9030-9aa96a61c504" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.579576] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b53a6a26-f764-4101-a709-314d85770721 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Acquiring lock "7048bf62-9134-47f6-9638-e8911bf85e17" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.579878] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b53a6a26-f764-4101-a709-314d85770721 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "7048bf62-9134-47f6-9638-e8911bf85e17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 990.677835] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5fbc878-9593-4a76-a007-c3e81a17e59d tempest-ServerActionsTestJSON-1929179909 tempest-ServerActionsTestJSON-1929179909-project-member] Acquiring lock "3349a87f-da82-4990-ad15-7cb0fd446ec7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 990.678736] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5fbc878-9593-4a76-a007-c3e81a17e59d tempest-ServerActionsTestJSON-1929179909 tempest-ServerActionsTestJSON-1929179909-project-member] Lock "3349a87f-da82-4990-ad15-7cb0fd446ec7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 992.184696] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a23b607f-586a-4a18-b887-6a0c5b2f221d tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Acquiring lock "c4413662-f234-4e54-8054-41c655c6412e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 992.184967] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a23b607f-586a-4a18-b887-6a0c5b2f221d 
tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "c4413662-f234-4e54-8054-41c655c6412e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.621815] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.766845] env[67820]: WARNING oslo_vmware.rw_handles [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 999.766845] env[67820]: ERROR oslo_vmware.rw_handles [ 999.767314] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 999.769830] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 999.770105] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Copying Virtual Disk [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/ee0d79f1-4da0-4bf8-ad98-36994c0aeca7/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk 
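The WARNING and traceback above come from the write handle's close(): after streaming the image to the datastore, the server may drop the connection before sending a final HTTP response, and oslo.vmware logs the RemoteDisconnected rather than failing the download (the file data had already been written, hence the "Downloaded image file data" line that follows). A sketch of that tolerant close, following the traceback:

    import http.client
    import logging

    LOG = logging.getLogger(__name__)

    def finish_upload(conn):
        # conn: the http.client.HTTPConnection used for the datastore upload.
        try:
            conn.getresponse()          # the call that raised in the traceback
        except http.client.RemoteDisconnected:
            LOG.warning('Error occurred while reading the HTTP response.')
        finally:
            conn.close()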
{{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 999.770401] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7fa6560b-a452-4511-bad5-b8458f3c7645 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.778191] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for the task: (returnval){ [ 999.778191] env[67820]: value = "task-3467350" [ 999.778191] env[67820]: _type = "Task" [ 999.778191] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 999.786052] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Task: {'id': task-3467350, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1000.288840] env[67820]: DEBUG oslo_vmware.exceptions [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1000.289019] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1000.289626] env[67820]: ERROR nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1000.289626] env[67820]: Faults: ['InvalidArgument'] [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Traceback (most recent call last): [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] yield resources [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self.driver.spawn(context, instance, image_meta, [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self._fetch_image_if_missing(context, vi) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] image_cache(vi, tmp_image_ds_loc) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] vm_util.copy_virtual_disk( [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] session._wait_for_task(vmdk_copy_task) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return self.wait_for_task(task_ref) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return evt.wait() [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] result = hub.switch() [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return self.greenlet.switch() [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self.f(*self.args, **self.kw) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] raise exceptions.translate_fault(task_info.error) [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1000.289626] env[67820]: ERROR 
nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Faults: ['InvalidArgument'] [ 1000.289626] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] [ 1000.290675] env[67820]: INFO nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Terminating instance [ 1000.291508] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1000.291737] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1000.292373] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1000.292567] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1000.292820] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7021b789-b38d-40fb-86b7-c03440cc203e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.295203] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87642ec6-6c9a-4d9a-982c-3489427a04cc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.302043] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1000.302344] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-10c46d11-6ab2-450b-a088-8ba0feab0ab7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.304668] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1000.304839] 
env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1000.305787] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ebed316d-922c-4a12-975a-a8a208a254f9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.310478] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1000.310478] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52002b9f-34ec-1276-d5da-678aabbd88fa" [ 1000.310478] env[67820]: _type = "Task" [ 1000.310478] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1000.317878] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52002b9f-34ec-1276-d5da-678aabbd88fa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1000.373018] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1000.373283] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1000.373605] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Deleting the datastore file [datastore1] 80f480dc-9bb8-4764-9b6b-793c0954962e {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1000.373966] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-122dbf87-49fc-46fa-9907-42b524fa4f9a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.380579] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for the task: (returnval){ [ 1000.380579] env[67820]: value = "task-3467352" [ 1000.380579] env[67820]: _type = "Task" [ 1000.380579] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1000.388204] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Task: {'id': task-3467352, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1000.623698] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.623698] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1000.820913] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1000.821200] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1000.821450] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3ebef75-806e-405f-8725-2d195cc37f74 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.832410] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1000.832596] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Fetch image to [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1000.832759] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1000.833542] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05177ec-7a99-46ad-b980-018f9cebf0c1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.839934] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daa111fd-7563-4a28-a877-44d3a98a599c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.848709] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4330cd70-63d3-4e7b-82e0-d0d621ddf5af {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.879902] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2010fba6-ee22-4220-9b1f-930daaf27474 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.890518] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d942184b-f62d-4dc0-90f8-1739a22bb370 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.892202] env[67820]: DEBUG oslo_vmware.api [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Task: {'id': task-3467352, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073795} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1000.892434] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1000.892612] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1000.892784] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1000.892948] env[67820]: INFO nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1000.895039] env[67820]: DEBUG nova.compute.claims [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1000.895217] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.895430] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.913726] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1000.968061] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1001.026410] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1001.026600] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1001.317677] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4221368-5f00-4e56-acf5-8d0c69b74ad0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.325325] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1dc3b54-0046-40d2-9702-4170e5c69860 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.357501] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47058e19-2ec2-4799-9dfa-137bd1b50f77 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.365264] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-044aff77-25f4-4713-83d2-0b69e06c5609 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.380599] env[67820]: DEBUG nova.compute.provider_tree [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.391211] env[67820]: DEBUG nova.scheduler.client.report [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.405274] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.510s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.405707] env[67820]: ERROR nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1001.405707] env[67820]: Faults: ['InvalidArgument'] [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Traceback (most recent call last): [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1001.405707] 
env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self.driver.spawn(context, instance, image_meta, [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self._fetch_image_if_missing(context, vi) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] image_cache(vi, tmp_image_ds_loc) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] vm_util.copy_virtual_disk( [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] session._wait_for_task(vmdk_copy_task) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return self.wait_for_task(task_ref) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return evt.wait() [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] result = hub.switch() [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] return self.greenlet.switch() [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] self.f(*self.args, **self.kw) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] raise exceptions.translate_fault(task_info.error) [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Faults: ['InvalidArgument'] [ 1001.405707] env[67820]: ERROR nova.compute.manager [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] [ 1001.407565] env[67820]: DEBUG nova.compute.utils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1001.408693] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Build of instance 80f480dc-9bb8-4764-9b6b-793c0954962e was re-scheduled: A specified parameter was not correct: fileType [ 1001.408693] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1001.411487] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1001.411487] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1001.411487] env[67820]: DEBUG nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1001.411487] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1001.621531] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.621531] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.621531] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1001.861803] env[67820]: DEBUG nova.network.neutron [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.877069] env[67820]: INFO nova.compute.manager [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Took 0.47 seconds to deallocate network for instance. 
[ 1001.975259] env[67820]: INFO nova.scheduler.client.report [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Deleted allocations for instance 80f480dc-9bb8-4764-9b6b-793c0954962e [ 1001.993735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-461c49bc-6a65-4451-822b-61a17b165c52 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 424.795s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.994864] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 223.915s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.995070] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Acquiring lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.995274] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.995434] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.997321] env[67820]: INFO nova.compute.manager [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Terminating instance [ 1001.999022] env[67820]: DEBUG nova.compute.manager [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Start destroying the instance on the hypervisor.
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1001.999224] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1001.999668] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e4f92f91-2689-4eac-8b7e-8eeff53c8d42 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.008697] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf607e43-185d-4855-9b0e-6187e323bcc3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.019667] env[67820]: DEBUG nova.compute.manager [None req-5a234b1e-0871-441b-9257-171ad7c3f418 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: 49515d53-7359-4bc6-8a26-95ffd4fe4ed4] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1002.040941] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 80f480dc-9bb8-4764-9b6b-793c0954962e could not be found. [ 1002.041219] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1002.041405] env[67820]: INFO nova.compute.manager [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1002.041658] env[67820]: DEBUG oslo.service.loopingcall [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1002.041939] env[67820]: DEBUG nova.compute.manager [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1002.042075] env[67820]: DEBUG nova.network.neutron [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1002.044425] env[67820]: DEBUG nova.compute.manager [None req-5a234b1e-0871-441b-9257-171ad7c3f418 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: 49515d53-7359-4bc6-8a26-95ffd4fe4ed4] Instance disappeared before build.
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1002.068897] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5a234b1e-0871-441b-9257-171ad7c3f418 tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "49515d53-7359-4bc6-8a26-95ffd4fe4ed4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.531s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.070365] env[67820]: DEBUG nova.network.neutron [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1002.078867] env[67820]: INFO nova.compute.manager [-] [instance: 80f480dc-9bb8-4764-9b6b-793c0954962e] Took 0.04 seconds to deallocate network for instance. [ 1002.084053] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1002.135784] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.136326] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.138124] env[67820]: INFO nova.compute.claims [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1002.195893] env[67820]: DEBUG oslo_concurrency.lockutils [None req-54d16d69-6c3e-441d-964d-4da7cb27d689 tempest-ServersTestFqdnHostnames-1958061245 tempest-ServersTestFqdnHostnames-1958061245-project-member] Lock "80f480dc-9bb8-4764-9b6b-793c0954962e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.201s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.528314] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc069213-6969-4dde-88b0-68fff7d2a385 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.536031] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574d4100-c9e2-4bf9-93bf-1d8c6092b588 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.565511]
env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03469f5d-a788-4a88-a517-f2419431aea0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.572878] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d500d318-5165-4c05-8ea8-cbad91179563 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.586048] env[67820]: DEBUG nova.compute.provider_tree [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1002.594788] env[67820]: DEBUG nova.scheduler.client.report [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1002.611310] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.475s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.611880] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1002.616109] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1002.638022] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1002.638022] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1002.638022] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1002.663879] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664228] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664330] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664384] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664514] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664636] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664758] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664878] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.664996] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.665130] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1002.665250] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1002.674673] env[67820]: DEBUG nova.compute.utils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1002.674673] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1002.674850] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1002.680168] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.684104] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Start building block device mappings for instance.
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1002.743426] env[67820]: DEBUG nova.policy [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9055d81ec19d46308356e18ca8240bc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35289e5018e7417eae449d502dda935d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1002.749664] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1002.775812] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1002.777241] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1002.777436] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1002.777635] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1002.777785] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1002.777933] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1002.778158] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1002.778411] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1002.778518] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1002.778700] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1002.778872] env[67820]: DEBUG nova.virt.hardware [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1002.779751] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8e7d9e1-6031-42cc-af54-b1674d84a82b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.788527] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6c9508-09ee-468c-8128-a2597a64a68b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.335853] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Successfully created port: 17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1004.110423] env[67820]: DEBUG nova.compute.manager [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Received event network-vif-plugged-17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 
1004.110759] env[67820]: DEBUG oslo_concurrency.lockutils [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] Acquiring lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1004.110996] env[67820]: DEBUG oslo_concurrency.lockutils [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1004.111396] env[67820]: DEBUG oslo_concurrency.lockutils [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1004.111575] env[67820]: DEBUG nova.compute.manager [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] No waiting events found dispatching network-vif-plugged-17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1004.111739] env[67820]: WARNING nova.compute.manager [req-e8f93cda-fed9-4e7a-84fb-ee8a98310a97 req-f135a9e7-f59b-4e97-90fc-d3cfe96d47b9 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Received unexpected event network-vif-plugged-17d786b4-5b86-4353-bdd3-fcdd0692e63f for instance with vm_state building and task_state spawning.
[ 1004.204758] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Successfully updated port: 17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1004.217591] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.217664] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquired lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1004.217823] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1004.259930] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1004.427895] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updating instance_info_cache with network_info: [{"id": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "address": "fa:16:3e:2d:34:d9", "network": {"id": "84e5b7bf-c268-4e36-a883-2d0fc3b2d97d", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-813923758-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35289e5018e7417eae449d502dda935d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17d786b4-5b", "ovs_interfaceid": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1004.441756] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Releasing lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.442162] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance network_info: |[{"id": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "address": "fa:16:3e:2d:34:d9", "network": {"id": "84e5b7bf-c268-4e36-a883-2d0fc3b2d97d", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-813923758-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35289e5018e7417eae449d502dda935d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17d786b4-5b", "ovs_interfaceid": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1004.442994] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:34:d9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ca83c3bc-f3ec-42ab-85b3-192512f766f3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '17d786b4-5b86-4353-bdd3-fcdd0692e63f', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1004.451114] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Creating folder: Project (35289e5018e7417eae449d502dda935d). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1004.451651] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-34b52681-b7c1-42ee-bd16-31367f2c77f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.463012] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Created folder: Project (35289e5018e7417eae449d502dda935d) in parent group-v692668. [ 1004.463232] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Creating folder: Instances. Parent ref: group-v692719. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1004.463465] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9dbca59d-a61a-478d-8311-89f8954d7d06 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.472972] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Created folder: Instances in parent group-v692719. [ 1004.473231] env[67820]: DEBUG oslo.service.loopingcall [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1004.473414] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1004.473628] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c04e3354-62da-4254-881e-f29cd6efe9a7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.494160] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1004.494160] env[67820]: value = "task-3467355" [ 1004.494160] env[67820]: _type = "Task" [ 1004.494160] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1004.502248] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467355, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1004.621190] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1004.633925] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1004.634184] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1004.634354] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1004.634511] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1004.635950] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f427e42f-3bc1-48b9-81e0-f15778f4e8df {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.645117] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd324953-1351-42ef-a915-cd94fce68fd7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.661859] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8658539-3227-4b03-b32e-ebb94fd0aa03 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.669138] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef156791-a0f8-40de-b7e5-38b95e512640 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1004.698878] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180914MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1004.699463] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1004.699463] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1004.782029] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782029] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782206] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782206] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782317] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782489] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782541] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782618] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782736] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.782854] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1004.796218] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 22905445-5120-4cdd-b965-099001e4147c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.808404] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.820291] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 53757aa2-7013-42a5-94bd-e831c8f08c40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.832420] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 76ca6882-c1e0-4ae0-af8d-7d5673a13540 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.843804] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f77a97a9-3c8c-4484-8948-dd6dc9dc4077 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.855816] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 128f3466-5304-44dc-a569-2ba894f5333c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.865826] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 25a4642e-155a-473f-953c-b0fedbb6eac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.878267] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8933061f-5489-447c-80ca-28d4a427349e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.889205] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d46bcc6e-6c3a-4200-a7f3-4c571ba6d819 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.900396] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d962be83-2769-465b-8628-1c1b656b6830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.910662] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.920397] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.929876] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.939119] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance df369452-6ff5-4d06-98d3-edf0824a685b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.948917] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0255953f-2dd2-48fa-9030-9aa96a61c504 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.958367] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7048bf62-9134-47f6-9638-e8911bf85e17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.968451] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3349a87f-da82-4990-ad15-7cb0fd446ec7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.980859] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c4413662-f234-4e54-8054-41c655c6412e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1004.981226] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1004.981312] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1005.005384] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467355, 'name': CreateVM_Task, 'duration_secs': 0.390526} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1005.005549] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1005.006237] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1005.006403] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1005.006765] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1005.006947] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bae392d4-0b1b-4c22-b47c-3707fd9e2c68 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.011209] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for the task: (returnval){ [ 1005.011209] env[67820]: value = 
"session[52fa9794-d32e-c496-0a13-09ee307dfa03]52af7b9f-610e-b7a5-9ccb-06bd9bb088b0" [ 1005.011209] env[67820]: _type = "Task" [ 1005.011209] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1005.020892] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52af7b9f-610e-b7a5-9ccb-06bd9bb088b0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.304036] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e06259f-ecad-46a7-8302-fc910a183131 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.310455] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fd759de-2604-44a6-882f-e487c475f0ad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.339240] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb731a5-3239-4bec-9072-198c9ef109e2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.346241] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c30a90b-a4ba-47ee-9654-eea2f55d15da {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.359442] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1005.367747] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1005.383184] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1005.383366] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.684s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1005.522513] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf 
tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1005.522513] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1005.522513] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1006.164287] env[67820]: DEBUG nova.compute.manager [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Received event network-changed-17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1006.164287] env[67820]: DEBUG nova.compute.manager [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Refreshing instance network info cache due to event network-changed-17d786b4-5b86-4353-bdd3-fcdd0692e63f. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1006.164287] env[67820]: DEBUG oslo_concurrency.lockutils [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] Acquiring lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1006.164287] env[67820]: DEBUG oslo_concurrency.lockutils [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] Acquired lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1006.164287] env[67820]: DEBUG nova.network.neutron [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Refreshing network info cache for port 17d786b4-5b86-4353-bdd3-fcdd0692e63f {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1006.378948] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1006.499311] env[67820]: DEBUG nova.network.neutron [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updated VIF entry in instance network info cache for port 17d786b4-5b86-4353-bdd3-fcdd0692e63f. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1006.499662] env[67820]: DEBUG nova.network.neutron [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updating instance_info_cache with network_info: [{"id": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "address": "fa:16:3e:2d:34:d9", "network": {"id": "84e5b7bf-c268-4e36-a883-2d0fc3b2d97d", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-813923758-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35289e5018e7417eae449d502dda935d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ca83c3bc-f3ec-42ab-85b3-192512f766f3", "external-id": "nsx-vlan-transportzone-879", "segmentation_id": 879, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap17d786b4-5b", "ovs_interfaceid": "17d786b4-5b86-4353-bdd3-fcdd0692e63f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1006.509610] env[67820]: DEBUG oslo_concurrency.lockutils [req-4ddceb08-52b3-4563-964b-50a04852a9f5 req-c264896e-015f-4c29-bbe3-0219a8a61100 service nova] Releasing lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1006.620574] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1018.759915] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1018.760410] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1022.805319] env[67820]: DEBUG oslo_concurrency.lockutils [None req-dd193638-d486-4a83-a771-da2fb987578f tempest-ServerGroupTestJSON-1578419083 tempest-ServerGroupTestJSON-1578419083-project-member] Acquiring lock "3873b861-181a-4242-a194-dca1f37f8715" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1022.805633] env[67820]: DEBUG oslo_concurrency.lockutils [None req-dd193638-d486-4a83-a771-da2fb987578f tempest-ServerGroupTestJSON-1578419083 tempest-ServerGroupTestJSON-1578419083-project-member] Lock "3873b861-181a-4242-a194-dca1f37f8715" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1044.923888] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0335e866-f4cc-4f75-9126-2eb6970d1b83 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "49a6ccef-af81-4177-93f3-0581c86242c4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.924166] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0335e866-f4cc-4f75-9126-2eb6970d1b83 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "49a6ccef-af81-4177-93f3-0581c86242c4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1049.780905] env[67820]: WARNING oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1049.780905] env[67820]: ERROR oslo_vmware.rw_handles [ 1049.781675] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on 
the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1049.783581] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1049.783843] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Copying Virtual Disk [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/ffc9f984-4262-44e7-abdf-f34e05f94aa0/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1049.784151] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-71ba601e-e44d-4871-8afa-716c27ea5459 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1049.792338] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1049.792338] env[67820]: value = "task-3467356" [ 1049.792338] env[67820]: _type = "Task" [ 1049.792338] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1049.800238] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467356, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1050.303282] env[67820]: DEBUG oslo_vmware.exceptions [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1050.303550] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1050.304118] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1050.304118] env[67820]: Faults: ['InvalidArgument'] [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Traceback (most recent call last): [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] yield resources [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self.driver.spawn(context, instance, image_meta, [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self._fetch_image_if_missing(context, vi) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] image_cache(vi, tmp_image_ds_loc) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] vm_util.copy_virtual_disk( [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] session._wait_for_task(vmdk_copy_task) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return self.wait_for_task(task_ref) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return evt.wait() [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] result = hub.switch() [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return self.greenlet.switch() [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self.f(*self.args, **self.kw) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] raise exceptions.translate_fault(task_info.error) [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Faults: ['InvalidArgument'] [ 1050.304118] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] [ 1050.305349] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Terminating instance [ 1050.305945] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1050.306249] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1050.306492] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-bd811f34-f53f-43b2-a638-a29c395a3eea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.308549] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1050.308739] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1050.309459] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c174c37-fc1a-4214-a551-0961e47c0756 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.317222] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1050.317438] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00267d3a-630a-4d93-82b7-fe2d6c987a1c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.319500] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1050.319673] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1050.320669] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d7b54f26-d466-42eb-a841-9c51373358fa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.325318] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1050.325318] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5248b5ed-c9c9-3163-b15e-7f79484c6c73" [ 1050.325318] env[67820]: _type = "Task" [ 1050.325318] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1050.332441] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5248b5ed-c9c9-3163-b15e-7f79484c6c73, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1050.384652] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1050.384872] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1050.385061] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleting the datastore file [datastore1] f1fcb6fc-97d9-46ed-ae53-a27f58992378 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1050.385334] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1e59c6ed-f824-4273-9444-5498a065bc99 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.391584] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1050.391584] env[67820]: value = "task-3467358" [ 1050.391584] env[67820]: _type = "Task" [ 1050.391584] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1050.399843] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467358, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1050.836079] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1050.836079] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1050.836416] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5c48bce0-ed0c-4fc8-9e88-619ddf8950bb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.846891] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1050.847079] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Fetch image to [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1050.847249] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1050.847955] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff9177c4-20da-49be-a1c7-f5fdefb4b02f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.854130] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68eb17e7-5fe5-44fc-9093-2fda3daf28f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.862911] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-740be1b8-ba30-42b1-b457-029b51f11bba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.895418] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-faee56ff-3919-4315-a2ff-79267a4f4b2f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.903595] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467358, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077114} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1050.903776] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e06c684c-c93a-4de4-8ac8-f4ba4bd1e8d7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.905377] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1050.905559] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1050.905727] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1050.905899] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1050.908010] env[67820]: DEBUG nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1050.908184] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1050.908392] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1050.926975] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1050.975881] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1051.035617] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1051.035806] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1051.248826] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baad4a48-4159-4170-91ee-3ff6877f9f3d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.257033] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c522c6e-cf00-4e9f-a3a6-87f0ff26a4a7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.286149] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69480a53-7850-4ef8-b1b4-aa25a05f3a9c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.292895] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35370113-143a-48c2-a098-8f404c16c44e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.305367] env[67820]: DEBUG nova.compute.provider_tree [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1051.314346] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1051.332567] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.424s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.333103] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1051.333103] env[67820]: Faults: ['InvalidArgument'] [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Traceback (most recent call last): [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self.driver.spawn(context, instance, image_meta, [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self._fetch_image_if_missing(context, vi) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] image_cache(vi, tmp_image_ds_loc) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] vm_util.copy_virtual_disk( [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] session._wait_for_task(vmdk_copy_task) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return self.wait_for_task(task_ref) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return evt.wait() [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] result = hub.switch() [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] return self.greenlet.switch() [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] self.f(*self.args, **self.kw) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: 
f1fcb6fc-97d9-46ed-ae53-a27f58992378] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] raise exceptions.translate_fault(task_info.error) [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Faults: ['InvalidArgument'] [ 1051.333103] env[67820]: ERROR nova.compute.manager [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] [ 1051.334056] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1051.335128] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Build of instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 was re-scheduled: A specified parameter was not correct: fileType [ 1051.335128] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1051.335490] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1051.335658] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1051.335824] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1051.335993] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1051.675755] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1051.693151] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Took 0.35 seconds to deallocate network for instance. [ 1051.806249] env[67820]: INFO nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted allocations for instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 [ 1051.831065] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 471.612s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.832298] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 269.915s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1051.832513] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1051.832708] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1051.832871] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.834890] env[67820]: INFO nova.compute.manager [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Terminating instance [ 1051.836601] env[67820]: DEBUG nova.compute.manager [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1051.836879] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1051.837195] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4b1d6a26-8018-4419-8761-7e9fd2d09d05 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.846213] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c82a06-2b73-4452-8026-6546cc85f930 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.858062] env[67820]: DEBUG nova.compute.manager [None req-8bb51154-3876-46f5-a1e7-ac76a980bb7f tempest-ServerMetadataNegativeTestJSON-1278293935 tempest-ServerMetadataNegativeTestJSON-1278293935-project-member] [instance: 22905445-5120-4cdd-b965-099001e4147c] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1051.879105] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f1fcb6fc-97d9-46ed-ae53-a27f58992378 could not be found. 
[ 1051.879314] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1051.879485] env[67820]: INFO nova.compute.manager [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1051.879736] env[67820]: DEBUG oslo.service.loopingcall [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1051.879976] env[67820]: DEBUG nova.compute.manager [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1051.880082] env[67820]: DEBUG nova.network.neutron [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1051.884706] env[67820]: DEBUG nova.compute.manager [None req-8bb51154-3876-46f5-a1e7-ac76a980bb7f tempest-ServerMetadataNegativeTestJSON-1278293935 tempest-ServerMetadataNegativeTestJSON-1278293935-project-member] [instance: 22905445-5120-4cdd-b965-099001e4147c] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1051.902611] env[67820]: DEBUG nova.network.neutron [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1051.904687] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8bb51154-3876-46f5-a1e7-ac76a980bb7f tempest-ServerMetadataNegativeTestJSON-1278293935 tempest-ServerMetadataNegativeTestJSON-1278293935-project-member] Lock "22905445-5120-4cdd-b965-099001e4147c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 241.440s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.911084] env[67820]: INFO nova.compute.manager [-] [instance: f1fcb6fc-97d9-46ed-ae53-a27f58992378] Took 0.03 seconds to deallocate network for instance. [ 1051.918897] env[67820]: DEBUG nova.compute.manager [None req-06922a7e-d854-425d-8436-990306a135a1 tempest-ServersNegativeTestJSON-141145616 tempest-ServersNegativeTestJSON-141145616-project-member] [instance: 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1051.943661] env[67820]: DEBUG nova.compute.manager [None req-06922a7e-d854-425d-8436-990306a135a1 tempest-ServersNegativeTestJSON-141145616 tempest-ServersNegativeTestJSON-141145616-project-member] [instance: 37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd] Instance disappeared before build. 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1051.966211] env[67820]: DEBUG oslo_concurrency.lockutils [None req-06922a7e-d854-425d-8436-990306a135a1 tempest-ServersNegativeTestJSON-141145616 tempest-ServersNegativeTestJSON-141145616-project-member] Lock "37b01cc5-8cf6-40c7-b7ac-4ba33a9d70fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 240.433s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.976747] env[67820]: DEBUG nova.compute.manager [None req-88a74e22-7e18-4f46-995e-f8ddad65584e tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 53757aa2-7013-42a5-94bd-e831c8f08c40] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.003634] env[67820]: DEBUG nova.compute.manager [None req-88a74e22-7e18-4f46-995e-f8ddad65584e tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] [instance: 53757aa2-7013-42a5-94bd-e831c8f08c40] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.010695] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f132f010-1b65-421f-82a6-c4b88d76d441 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "f1fcb6fc-97d9-46ed-ae53-a27f58992378" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.178s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.025658] env[67820]: DEBUG oslo_concurrency.lockutils [None req-88a74e22-7e18-4f46-995e-f8ddad65584e tempest-DeleteServersAdminTestJSON-1264121813 tempest-DeleteServersAdminTestJSON-1264121813-project-member] Lock "53757aa2-7013-42a5-94bd-e831c8f08c40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 239.674s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.033917] env[67820]: DEBUG nova.compute.manager [None req-a161f849-d48c-49bb-b4e7-346d4f032f46 tempest-ServersTestManualDisk-1039961554 tempest-ServersTestManualDisk-1039961554-project-member] [instance: 76ca6882-c1e0-4ae0-af8d-7d5673a13540] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.059135] env[67820]: DEBUG nova.compute.manager [None req-a161f849-d48c-49bb-b4e7-346d4f032f46 tempest-ServersTestManualDisk-1039961554 tempest-ServersTestManualDisk-1039961554-project-member] [instance: 76ca6882-c1e0-4ae0-af8d-7d5673a13540] Instance disappeared before build. 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.082883] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a161f849-d48c-49bb-b4e7-346d4f032f46 tempest-ServersTestManualDisk-1039961554 tempest-ServersTestManualDisk-1039961554-project-member] Lock "76ca6882-c1e0-4ae0-af8d-7d5673a13540" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 237.495s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.092276] env[67820]: DEBUG nova.compute.manager [None req-57399e52-706b-46da-916c-3f60fae914f1 tempest-ServersNegativeTestMultiTenantJSON-57916155 tempest-ServersNegativeTestMultiTenantJSON-57916155-project-member] [instance: f77a97a9-3c8c-4484-8948-dd6dc9dc4077] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.114915] env[67820]: DEBUG nova.compute.manager [None req-57399e52-706b-46da-916c-3f60fae914f1 tempest-ServersNegativeTestMultiTenantJSON-57916155 tempest-ServersNegativeTestMultiTenantJSON-57916155-project-member] [instance: f77a97a9-3c8c-4484-8948-dd6dc9dc4077] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.137397] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57399e52-706b-46da-916c-3f60fae914f1 tempest-ServersNegativeTestMultiTenantJSON-57916155 tempest-ServersNegativeTestMultiTenantJSON-57916155-project-member] Lock "f77a97a9-3c8c-4484-8948-dd6dc9dc4077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.543s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.148704] env[67820]: DEBUG nova.compute.manager [None req-d71f314c-5f47-475e-a05e-4febc95e1c7f tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] [instance: 128f3466-5304-44dc-a569-2ba894f5333c] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.192551] env[67820]: DEBUG nova.compute.manager [None req-d71f314c-5f47-475e-a05e-4febc95e1c7f tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] [instance: 128f3466-5304-44dc-a569-2ba894f5333c] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.214889] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d71f314c-5f47-475e-a05e-4febc95e1c7f tempest-AttachVolumeTestJSON-754767963 tempest-AttachVolumeTestJSON-754767963-project-member] Lock "128f3466-5304-44dc-a569-2ba894f5333c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.678s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.223587] env[67820]: DEBUG nova.compute.manager [None req-79f5bb8e-dfdb-4766-93c3-007a34828c33 tempest-ServerPasswordTestJSON-958337142 tempest-ServerPasswordTestJSON-958337142-project-member] [instance: 25a4642e-155a-473f-953c-b0fedbb6eac0] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.245494] env[67820]: DEBUG nova.compute.manager [None req-79f5bb8e-dfdb-4766-93c3-007a34828c33 tempest-ServerPasswordTestJSON-958337142 tempest-ServerPasswordTestJSON-958337142-project-member] [instance: 25a4642e-155a-473f-953c-b0fedbb6eac0] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.266145] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79f5bb8e-dfdb-4766-93c3-007a34828c33 tempest-ServerPasswordTestJSON-958337142 tempest-ServerPasswordTestJSON-958337142-project-member] Lock "25a4642e-155a-473f-953c-b0fedbb6eac0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.595s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.274960] env[67820]: DEBUG nova.compute.manager [None req-f0f99f06-0777-43a2-ad81-ef5b64fe9c5a tempest-ServerMetadataTestJSON-272358107 tempest-ServerMetadataTestJSON-272358107-project-member] [instance: 8933061f-5489-447c-80ca-28d4a427349e] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.298518] env[67820]: DEBUG nova.compute.manager [None req-f0f99f06-0777-43a2-ad81-ef5b64fe9c5a tempest-ServerMetadataTestJSON-272358107 tempest-ServerMetadataTestJSON-272358107-project-member] [instance: 8933061f-5489-447c-80ca-28d4a427349e] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.319029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f0f99f06-0777-43a2-ad81-ef5b64fe9c5a tempest-ServerMetadataTestJSON-272358107 tempest-ServerMetadataTestJSON-272358107-project-member] Lock "8933061f-5489-447c-80ca-28d4a427349e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.924s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.327997] env[67820]: DEBUG nova.compute.manager [None req-5e5baf12-b83d-4cad-83d4-d9143ad3b1f6 tempest-InstanceActionsV221TestJSON-1418511094 tempest-InstanceActionsV221TestJSON-1418511094-project-member] [instance: d46bcc6e-6c3a-4200-a7f3-4c571ba6d819] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.352599] env[67820]: DEBUG nova.compute.manager [None req-5e5baf12-b83d-4cad-83d4-d9143ad3b1f6 tempest-InstanceActionsV221TestJSON-1418511094 tempest-InstanceActionsV221TestJSON-1418511094-project-member] [instance: d46bcc6e-6c3a-4200-a7f3-4c571ba6d819] Instance disappeared before build. 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.377494] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5e5baf12-b83d-4cad-83d4-d9143ad3b1f6 tempest-InstanceActionsV221TestJSON-1418511094 tempest-InstanceActionsV221TestJSON-1418511094-project-member] Lock "d46bcc6e-6c3a-4200-a7f3-4c571ba6d819" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.636s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.388245] env[67820]: DEBUG nova.compute.manager [None req-68d36599-6926-41e4-ac0f-b2fba62cc1ca tempest-ServerAddressesNegativeTestJSON-708644920 tempest-ServerAddressesNegativeTestJSON-708644920-project-member] [instance: d962be83-2769-465b-8628-1c1b656b6830] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.410882] env[67820]: DEBUG nova.compute.manager [None req-68d36599-6926-41e4-ac0f-b2fba62cc1ca tempest-ServerAddressesNegativeTestJSON-708644920 tempest-ServerAddressesNegativeTestJSON-708644920-project-member] [instance: d962be83-2769-465b-8628-1c1b656b6830] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1052.430715] env[67820]: DEBUG oslo_concurrency.lockutils [None req-68d36599-6926-41e4-ac0f-b2fba62cc1ca tempest-ServerAddressesNegativeTestJSON-708644920 tempest-ServerAddressesNegativeTestJSON-708644920-project-member] Lock "d962be83-2769-465b-8628-1c1b656b6830" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.240s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.439386] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Starting instance...
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.495205] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.495467] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.497063] env[67820]: INFO nova.compute.claims [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1052.780745] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9c48227-8b23-4381-b841-0565eb787b79 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.789520] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d9aacae-13b4-4240-991e-11a9af630239 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.820493] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da64d27c-74c5-4e10-b2d9-924f364a4ec6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.827517] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd457b04-4def-4df0-8893-32cb949d512a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.840930] env[67820]: DEBUG nova.compute.provider_tree [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1052.849533] env[67820]: DEBUG nova.scheduler.client.report [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1052.863795] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.864325] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1052.896929] env[67820]: DEBUG nova.compute.utils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1052.898404] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1052.898571] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1052.909390] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1052.971532] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1052.984433] env[67820]: DEBUG nova.policy [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bb006fa217a9496f819f6d98acbd9c23', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98bd16e535c84bcd932ef0a99d723cc2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1052.997021] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1052.997195] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1052.997352] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1052.997532] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1052.997677] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1052.997823] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1052.998077] 
env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1052.998271] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1052.998442] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1052.998605] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1052.998773] env[67820]: DEBUG nova.virt.hardware [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1052.999646] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c3b8c4d-a77d-440e-b7bc-50cedb2a30bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.007297] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aef600d8-9314-40b1-9bcd-cc314f0591df {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.337381] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Successfully created port: 0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1054.178753] env[67820]: DEBUG nova.compute.manager [req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Received event network-vif-plugged-0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1054.178753] env[67820]: DEBUG oslo_concurrency.lockutils [req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] Acquiring lock "45a68888-979e-4255-98a0-bcb289f57830-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1054.179099] env[67820]: DEBUG oslo_concurrency.lockutils 
[req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] Lock "45a68888-979e-4255-98a0-bcb289f57830-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.181067] env[67820]: DEBUG oslo_concurrency.lockutils [req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] Lock "45a68888-979e-4255-98a0-bcb289f57830-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.181288] env[67820]: DEBUG nova.compute.manager [req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] No waiting events found dispatching network-vif-plugged-0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1054.181460] env[67820]: WARNING nova.compute.manager [req-bc09d6ad-f9c6-417c-899d-e6e6521c384c req-4331f562-5eaf-43bf-b7f9-2bb64db9d4e1 service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Received unexpected event network-vif-plugged-0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 for instance with vm_state building and task_state spawning. [ 1054.277888] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Successfully updated port: 0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1054.290596] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1054.290741] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1054.290887] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1054.333526] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1054.486967] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Updating instance_info_cache with network_info: [{"id": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "address": "fa:16:3e:38:d6:4c", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b11cdd5-7f", "ovs_interfaceid": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1054.517487] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1054.517791] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance network_info: |[{"id": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "address": "fa:16:3e:38:d6:4c", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b11cdd5-7f", "ovs_interfaceid": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1054.518192] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:d6:4c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a485857d-7086-4dcf-9d65-d0dcd177fcb0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1054.525943] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating folder: Project (98bd16e535c84bcd932ef0a99d723cc2). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1054.526464] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3869d151-3c15-4bcf-b120-f1bfd5453ac5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.536863] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created folder: Project (98bd16e535c84bcd932ef0a99d723cc2) in parent group-v692668. [ 1054.537056] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating folder: Instances. Parent ref: group-v692722. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1054.537270] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9a09646c-b0e4-436a-a7be-54549c3be364 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.545124] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created folder: Instances in parent group-v692722. [ 1054.545340] env[67820]: DEBUG oslo.service.loopingcall [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1054.545511] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1054.545691] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-090dc02b-c091-4f76-bb8d-0217b1accef8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.563798] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1054.563798] env[67820]: value = "task-3467361" [ 1054.563798] env[67820]: _type = "Task" [ 1054.563798] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1054.572036] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467361, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.073595] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467361, 'name': CreateVM_Task, 'duration_secs': 0.276168} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1055.073783] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1055.074472] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1055.074974] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1055.074974] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1055.075193] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a8384a6-7ccc-47b6-8f51-1174055f4f91 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.079574] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1055.079574] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5278b2ff-462e-50ac-88f6-6053067587ea" [ 1055.079574] env[67820]: _type = "Task" [ 1055.079574] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1055.086837] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5278b2ff-462e-50ac-88f6-6053067587ea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.590450] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1055.590807] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1055.590903] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1056.207956] env[67820]: DEBUG nova.compute.manager [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Received event network-changed-0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1056.207956] env[67820]: DEBUG nova.compute.manager [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Refreshing instance network info cache due to event network-changed-0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1056.208185] env[67820]: DEBUG oslo_concurrency.lockutils [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] Acquiring lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1056.208337] env[67820]: DEBUG oslo_concurrency.lockutils [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] Acquired lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1056.208507] env[67820]: DEBUG nova.network.neutron [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Refreshing network info cache for port 0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1056.580160] env[67820]: DEBUG nova.network.neutron [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Updated VIF entry in instance network info cache for port 0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1056.580536] env[67820]: DEBUG nova.network.neutron [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Updating instance_info_cache with network_info: [{"id": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "address": "fa:16:3e:38:d6:4c", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b11cdd5-7f", "ovs_interfaceid": "0b11cdd5-7fe5-4d45-ba0f-8851eb6824b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1056.589973] env[67820]: DEBUG oslo_concurrency.lockutils [req-ae820df5-6ed7-4c94-8496-ea8262592ae2 req-25b6bf90-74fd-43d5-9af3-05fbd23654ac service nova] Releasing lock "refresh_cache-45a68888-979e-4255-98a0-bcb289f57830" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1059.621660] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1061.621816] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1061.622164] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1061.821470] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "45a68888-979e-4255-98a0-bcb289f57830" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.622097] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.622375] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1062.622451] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1062.645456] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.645637] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.645783] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.645915] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646052] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building.
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646177] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646363] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646505] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646619] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646737] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1062.646858] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1062.647360] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.647538] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1063.621754] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.615945] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.621297] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1066.632851] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1066.633223] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1066.633337] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1066.633395] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1066.635152] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2fba933-1c80-4108-baed-ea6df690bd5c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.644760] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05480a67-17d2-4d7a-b59c-fb06417d8c89 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.659842] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c18882-fc3f-4049-9fc4-c4de9c0e157c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.667601] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4150b74-3443-4b12-82e9-303ec502578c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.697654] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1066.697853] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1066.698135] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1066.788341] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.788702] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.788932] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.789235] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790205] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790387] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790520] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790646] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790764] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.790877] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1066.809186] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.823084] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.833730] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance df369452-6ff5-4d06-98d3-edf0824a685b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.849833] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0255953f-2dd2-48fa-9030-9aa96a61c504 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.863120] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7048bf62-9134-47f6-9638-e8911bf85e17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.876671] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3349a87f-da82-4990-ad15-7cb0fd446ec7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.893826] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c4413662-f234-4e54-8054-41c655c6412e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.905191] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.919538] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3873b861-181a-4242-a194-dca1f37f8715 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.930326] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49a6ccef-af81-4177-93f3-0581c86242c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1066.930566] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1066.930709] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1067.201128] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c54d2e58-91b1-429b-b65c-52747005769d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.211676] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eec020fa-14e4-4282-985c-2e9ee57fc852 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.242583] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39e4773c-f5e7-4d6e-a392-96f3513e767e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.250106] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5526366a-e4b9-4461-91bf-b5f94a5a0208 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1067.263169] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1067.272735] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1067.286395] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1067.286592] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1068.864582] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.864582] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1069.287432] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1099.798590] env[67820]: WARNING oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1099.798590] env[67820]: ERROR oslo_vmware.rw_handles [ 1099.799398] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1099.801410] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Caching image {{(pid=67820)
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1099.801653] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Copying Virtual Disk [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/a7b931c7-4ad7-459c-9f16-78b097174811/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1099.801942] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b58e911a-8018-4d0d-90cf-7a589f9b85f9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.809778] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1099.809778] env[67820]: value = "task-3467362" [ 1099.809778] env[67820]: _type = "Task" [ 1099.809778] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1099.818907] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467362, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1100.319727] env[67820]: DEBUG oslo_vmware.exceptions [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1100.320025] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1100.320591] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1100.320591] env[67820]: Faults: ['InvalidArgument'] [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Traceback (most recent call last): [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] yield resources [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self.driver.spawn(context, instance, image_meta, [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self._fetch_image_if_missing(context, vi) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] image_cache(vi, tmp_image_ds_loc) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] vm_util.copy_virtual_disk( [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] session._wait_for_task(vmdk_copy_task) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return self.wait_for_task(task_ref) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return evt.wait() [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] result = hub.switch() [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return self.greenlet.switch() [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self.f(*self.args, **self.kw) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] raise exceptions.translate_fault(task_info.error) [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Faults: ['InvalidArgument'] [ 1100.320591] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] [ 1100.321791] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Terminating instance [ 1100.322455] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1100.322675] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1100.322903] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-b47272ca-15cf-443c-935b-7e592894b6fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.324981] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1100.325183] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1100.325898] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c34449a-c608-43df-9c65-fe41f2aa210b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.333595] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1100.333795] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8f4cd118-ab11-4e6a-a25e-bf7082ca8c77 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.335819] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1100.335988] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1100.336905] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2d82cecd-2506-416d-afcd-dcb5ddfdd60d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.342012] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1100.342012] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52cb456b-355a-167f-c5e4-c184498554a5" [ 1100.342012] env[67820]: _type = "Task" [ 1100.342012] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1100.349497] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52cb456b-355a-167f-c5e4-c184498554a5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1100.411050] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1100.411050] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1100.411270] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleting the datastore file [datastore1] 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1100.411533] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-910c3d0d-2b42-4618-bc87-8c6eca7b95e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.417447] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1100.417447] env[67820]: value = "task-3467364" [ 1100.417447] env[67820]: _type = "Task" [ 1100.417447] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1100.424995] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467364, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1100.851840] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1100.852148] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating directory with path [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1100.852361] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-20fa1e7d-5b34-484f-b8f4-572374574afd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.862960] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Created directory with path [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1100.863162] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Fetch image to [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1100.863326] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1100.864073] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68257fa2-a1fb-484d-a9ba-046b4c5a4aa1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.870341] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f301e0f2-22f0-4a74-b7e4-b6847207b6a5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.879242] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-327bb774-dc3e-4f57-9af2-670293eef68a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.910852] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cca26855-e934-4373-b34b-1e31e90dd58b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.916249] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46952c48-021a-4c1f-b731-086bf41274cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.925575] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467364, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06446} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1100.925818] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1100.926016] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1100.926195] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1100.926367] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1100.928425] env[67820]: DEBUG nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1100.928599] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1100.928851] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1100.950655] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1101.003804] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1101.063139] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1101.063402] env[67820]: DEBUG oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1101.289801] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61009a74-91a6-4d5d-8c73-70cd7c5cff76 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.299148] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d86968e2-314b-41f0-8b6c-33ca1a3a0df7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.328911] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab464ae2-2c68-4b82-894d-dd6425be6132 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.336417] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5adc375e-9096-4dae-a543-b112ba17e944 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.351371] env[67820]: DEBUG nova.compute.provider_tree [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1101.361103] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1101.372939] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.444s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.373797] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1101.373797] env[67820]: Faults: ['InvalidArgument'] [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Traceback (most recent call last): [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self.driver.spawn(context, instance, image_meta, [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self._fetch_image_if_missing(context, vi) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] image_cache(vi, tmp_image_ds_loc) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] vm_util.copy_virtual_disk( [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] session._wait_for_task(vmdk_copy_task) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return self.wait_for_task(task_ref) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return evt.wait() [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] result = hub.switch() [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] return self.greenlet.switch() [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] self.f(*self.args, **self.kw) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 
939b4cbc-aa1c-4995-9675-4c3d1f4dce55] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] raise exceptions.translate_fault(task_info.error) [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Faults: ['InvalidArgument'] [ 1101.373797] env[67820]: ERROR nova.compute.manager [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] [ 1101.375576] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1101.377169] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Build of instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 was re-scheduled: A specified parameter was not correct: fileType [ 1101.377169] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1101.377750] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1101.378143] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1101.378439] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1101.378715] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1101.678617] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.693496] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Took 0.31 seconds to deallocate network for instance. [ 1101.785972] env[67820]: INFO nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted allocations for instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 [ 1101.811496] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 521.537s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.812722] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 320.402s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.812939] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1101.813148] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.813309] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.815244] env[67820]: INFO nova.compute.manager [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Terminating instance [ 1101.816837] env[67820]: DEBUG nova.compute.manager [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1101.817035] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1101.817498] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8e1e1bd0-34d3-4fef-8d7b-6322e756c693 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.826653] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f615da33-c9a4-4fb2-af2d-c0a587ffa7fd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.837572] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1101.857296] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 939b4cbc-aa1c-4995-9675-4c3d1f4dce55 could not be found. 
[ 1101.857542] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1101.857685] env[67820]: INFO nova.compute.manager [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1101.857962] env[67820]: DEBUG oslo.service.loopingcall [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1101.858157] env[67820]: DEBUG nova.compute.manager [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1101.858391] env[67820]: DEBUG nova.network.neutron [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1101.891354] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1101.891654] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.893186] env[67820]: INFO nova.compute.claims [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1101.896835] env[67820]: DEBUG nova.network.neutron [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.912059] env[67820]: INFO nova.compute.manager [-] [instance: 939b4cbc-aa1c-4995-9675-4c3d1f4dce55] Took 0.05 seconds to deallocate network for instance. 
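
Terminate then finds the VM already gone from the hypervisor, and the WARNING above shows the delete continuing regardless: InstanceNotFound from the driver is treated as 'already destroyed', so network deallocation and the rest of the teardown still run. A compact, purely illustrative sketch of that idempotent-delete shape (all names invented):

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def backend_destroy(instance_uuid):
    # Pretend the hypervisor already lost the VM, as for
    # 939b4cbc-... in the WARNING above.
    raise InstanceNotFound(instance_uuid)

def deallocate_network(instance_uuid):
    print('Deallocating network for %s' % instance_uuid)

def destroy(instance_uuid):
    try:
        backend_destroy(instance_uuid)
    except InstanceNotFound:
        # Swallow and continue: a VM that is already gone must not
        # block deallocating networks or releasing the claim.
        print('WARNING: %s not on backend, continuing teardown' % instance_uuid)
    print('Instance destroyed')
    deallocate_network(instance_uuid)

destroy('939b4cbc-aa1c-4995-9675-4c3d1f4dce55')
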
[ 1102.014703] env[67820]: DEBUG oslo_concurrency.lockutils [None req-03a9b658-4a5c-44be-83a0-7cf3f46001b0 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "939b4cbc-aa1c-4995-9675-4c3d1f4dce55" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.212446] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17191094-50f5-42f8-a8f0-ef7bb08f5b9e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.220071] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2718eeff-bfcc-422a-8bc8-cabcfb9be0ef {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.251863] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc4c58e1-d31c-419f-99a9-20c6e03e5cd2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.258983] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0b49c66-59ef-4b0e-bdc7-33e7c49707e9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.272116] env[67820]: DEBUG nova.compute.provider_tree [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1102.280448] env[67820]: DEBUG nova.scheduler.client.report [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1102.296156] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.404s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.296639] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1102.340709] env[67820]: DEBUG nova.compute.utils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1102.342578] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1102.342578] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1102.351466] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1102.399655] env[67820]: DEBUG nova.policy [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1fdcd371f66742e2b8a56846e91e62aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7767a564247b405b92073629bffda753', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1102.422732] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1102.452253] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1102.452531] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1102.452700] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1102.452916] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1102.453110] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1102.453182] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1102.453371] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1102.453528] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1102.453697] env[67820]: DEBUG 
nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1102.453858] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1102.454040] env[67820]: DEBUG nova.virt.hardware [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1102.454903] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed9ae583-66a9-4c6c-bc82-a2af5440453c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.463706] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ff19fd-965a-4b1e-a727-231a8e5854a8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.726935] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Successfully created port: 2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1103.405697] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Successfully updated port: 2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1103.416060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1103.416200] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1103.416400] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1103.454822] env[67820]: DEBUG nova.network.neutron [None 
req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1103.702630] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Updating instance_info_cache with network_info: [{"id": "2adefe1a-e108-4e5a-881b-836931dd0f61", "address": "fa:16:3e:63:c9:4f", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2adefe1a-e1", "ovs_interfaceid": "2adefe1a-e108-4e5a-881b-836931dd0f61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1103.712205] env[67820]: DEBUG nova.compute.manager [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Received event network-vif-plugged-2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1103.712514] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Acquiring lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1103.712639] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1103.712842] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1103.712953] env[67820]: DEBUG nova.compute.manager 
[req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] No waiting events found dispatching network-vif-plugged-2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1103.713127] env[67820]: WARNING nova.compute.manager [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Received unexpected event network-vif-plugged-2adefe1a-e108-4e5a-881b-836931dd0f61 for instance with vm_state building and task_state spawning. [ 1103.713284] env[67820]: DEBUG nova.compute.manager [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Received event network-changed-2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1103.713433] env[67820]: DEBUG nova.compute.manager [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Refreshing instance network info cache due to event network-changed-2adefe1a-e108-4e5a-881b-836931dd0f61. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1103.713591] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Acquiring lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1103.715208] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1103.715469] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance network_info: |[{"id": "2adefe1a-e108-4e5a-881b-836931dd0f61", "address": "fa:16:3e:63:c9:4f", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2adefe1a-e1", "ovs_interfaceid": "2adefe1a-e108-4e5a-881b-836931dd0f61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1103.715737] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Acquired lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1103.715903] env[67820]: DEBUG nova.network.neutron [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Refreshing network info cache for port 2adefe1a-e108-4e5a-881b-836931dd0f61 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1103.716925] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:63:c9:4f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '40c947c4-f471-4d48-8e43-fee54198107e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2adefe1a-e108-4e5a-881b-836931dd0f61', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1103.724387] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating folder: Project (7767a564247b405b92073629bffda753). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1103.726890] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2ef9d51-b8b7-43ef-baa8-48e9148fa7b2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.739928] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created folder: Project (7767a564247b405b92073629bffda753) in parent group-v692668. [ 1103.740132] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating folder: Instances. Parent ref: group-v692725. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1103.740363] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-54ae6709-4686-43b1-9501-95528c0d938b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.748665] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created folder: Instances in parent group-v692725. [ 1103.748885] env[67820]: DEBUG oslo.service.loopingcall [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1103.749072] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1103.749270] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2458081f-8f68-4379-9a64-85239a28df73 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1103.769436] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1103.769436] env[67820]: value = "task-3467367" [ 1103.769436] env[67820]: _type = "Task" [ 1103.769436] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1103.776935] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467367, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1103.986289] env[67820]: DEBUG nova.network.neutron [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Updated VIF entry in instance network info cache for port 2adefe1a-e108-4e5a-881b-836931dd0f61. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1103.986666] env[67820]: DEBUG nova.network.neutron [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Updating instance_info_cache with network_info: [{"id": "2adefe1a-e108-4e5a-881b-836931dd0f61", "address": "fa:16:3e:63:c9:4f", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2adefe1a-e1", "ovs_interfaceid": "2adefe1a-e108-4e5a-881b-836931dd0f61", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1103.996764] env[67820]: DEBUG oslo_concurrency.lockutils [req-14ac5f15-a3a0-466b-af10-217c752caba8 req-c20f29b2-d494-4d6c-ba8b-05ed727fe0d1 service nova] Releasing lock "refresh_cache-04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1104.279976] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467367, 'name': CreateVM_Task, 'duration_secs': 0.282905} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1104.280169] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1104.280858] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1104.281043] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1104.281424] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1104.281652] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-599a3ce9-f453-45fd-a7ce-a64aa393d215 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1104.285902] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1104.285902] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f6ffd3-5922-7fda-46e1-0da7a362a5ed" [ 1104.285902] env[67820]: _type = "Task" [ 1104.285902] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1104.293170] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f6ffd3-5922-7fda-46e1-0da7a362a5ed, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1104.796913] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1104.797217] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1104.797446] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1107.163186] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1107.163186] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1107.213859] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.621760] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1116.622120] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1116.633372] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 0 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1117.621500] env[67820]: DEBUG 
oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1117.621714] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1117.630857] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1119.637589] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.621978] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.621978] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1123.617847] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1124.620991] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1124.621322] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1124.621322] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1124.641343] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.641556] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.641708] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.641837] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.641960] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642088] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642206] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642320] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642434] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642544] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1124.642706] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1124.643144] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1124.643312] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1124.643473] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1126.638962] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1128.623015] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1128.634766] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1128.634951] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1128.635140] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1128.635296] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1128.636389] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc2ad71b-f4d6-486f-825b-d0cb8ec0ad0f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.645377] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2400cae-c424-407e-a2d5-986f12e532cd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.658920] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea024a7b-2400-403e-96e6-ef071392203d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.666026] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8a4a2b2-83ba-4aa1-8765-15b7289626f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.695967] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180890MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1128.696133] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1128.696319] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1128.797096] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797282] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797414] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797537] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797655] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797772] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.797886] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.798010] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.798129] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.798241] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1128.810268] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.820727] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance df369452-6ff5-4d06-98d3-edf0824a685b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.830652] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0255953f-2dd2-48fa-9030-9aa96a61c504 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.839781] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 7048bf62-9134-47f6-9638-e8911bf85e17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.850594] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3349a87f-da82-4990-ad15-7cb0fd446ec7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.859669] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c4413662-f234-4e54-8054-41c655c6412e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.868917] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.877768] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3873b861-181a-4242-a194-dca1f37f8715 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.886839] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49a6ccef-af81-4177-93f3-0581c86242c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.895942] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.905064] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1128.905289] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1128.905433] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1128.921195] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1128.935162] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1128.935348] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1128.947568] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1128.963937] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: 
COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1129.179965] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d4199df-54b0-4b2d-92d6-783fa7af59c4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.187552] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1693fd5-39b5-4600-b4cc-75b36c8e9087 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.217686] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9b44703-f3a3-4bb0-87f6-f0778f2261d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.225563] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91dd2e16-d650-48a3-8a24-2947dbf09125 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.238738] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1129.247123] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1129.260986] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1129.261194] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1131.260092] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.563052] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.583198] env[67820]: DEBUG 
nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Getting list of instances from cluster (obj){ [ 1139.583198] env[67820]: value = "domain-c8" [ 1139.583198] env[67820]: _type = "ClusterComputeResource" [ 1139.583198] env[67820]: } {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1139.584454] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-336766b7-f3e5-4027-9f08-e6047afe479a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.601127] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Got total of 10 instances {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1139.601273] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid d5c358d0-46f8-4cba-9e37-34c5dfe92526 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.601488] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid ffe8063c-5dae-4e58-beca-f3a883d5d8df {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.601715] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 39420619-61c5-4e52-8236-d3abc3ef6f0f {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.601903] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 906c40dd-b6d6-492a-aa51-58901959a60d {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602074] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 2311d6b7-32ab-45c6-83f8-9b341e847bf0 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602229] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid d06b6984-d1d4-4afd-8ffd-f37407697d4b {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602378] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 31ec9cab-abfb-4a73-8df8-057670201267 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602526] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid cfc7ee69-6da9-4f70-b245-17b12674feeb {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602704] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 45a68888-979e-4255-98a0-bcb289f57830 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.602864] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1139.603200] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock 
"d5c358d0-46f8-4cba-9e37-34c5dfe92526" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.603433] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.603627] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.603824] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "906c40dd-b6d6-492a-aa51-58901959a60d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.604063] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.604275] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.604472] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.604662] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.604854] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "45a68888-979e-4255-98a0-bcb289f57830" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.605064] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1147.878372] env[67820]: WARNING oslo_vmware.rw_handles [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1147.878372] env[67820]: ERROR oslo_vmware.rw_handles [ 1147.878950] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1147.881354] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1147.881602] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Copying Virtual Disk [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/f6dcf0a3-c3d3-4f72-a138-82ba0d54b8b0/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1147.881916] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5659ae97-bd65-44f0-a601-84275aad741c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.890150] env[67820]: DEBUG oslo_vmware.api [None 
req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1147.890150] env[67820]: value = "task-3467368" [ 1147.890150] env[67820]: _type = "Task" [ 1147.890150] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1147.897793] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467368, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1148.400711] env[67820]: DEBUG oslo_vmware.exceptions [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1148.401048] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1148.401600] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.401600] env[67820]: Faults: ['InvalidArgument'] [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Traceback (most recent call last): [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] yield resources [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self.driver.spawn(context, instance, image_meta, [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self._fetch_image_if_missing(context, vi) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: 
d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] image_cache(vi, tmp_image_ds_loc) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] vm_util.copy_virtual_disk( [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] session._wait_for_task(vmdk_copy_task) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return self.wait_for_task(task_ref) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return evt.wait() [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] result = hub.switch() [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return self.greenlet.switch() [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self.f(*self.args, **self.kw) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] raise exceptions.translate_fault(task_info.error) [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Faults: ['InvalidArgument'] [ 1148.401600] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] [ 1148.402675] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 
tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Terminating instance [ 1148.403532] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1148.403738] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1148.403988] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0804edef-266b-4fdb-ac1f-9bbc1c3ec361 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.406351] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1148.406541] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1148.407320] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b09c20e0-266b-4f79-a763-a07f74475cab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.413908] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1148.414185] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b3543e4e-b7f5-4393-aa37-eea44a3bb3fb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.416372] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1148.416542] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1148.417518] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-388a5461-83bb-4ed8-a277-29f02524a8e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.422610] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for the task: (returnval){ [ 1148.422610] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]521f1e21-2665-b60f-6e6a-6adcc3ea7400" [ 1148.422610] env[67820]: _type = "Task" [ 1148.422610] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1148.435430] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]521f1e21-2665-b60f-6e6a-6adcc3ea7400, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1148.479255] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1148.479546] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1148.479743] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleting the datastore file [datastore1] d5c358d0-46f8-4cba-9e37-34c5dfe92526 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1148.480052] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ad1e0ddd-69e0-444c-ac79-4ff01feee917 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.486561] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for the task: (returnval){ [ 1148.486561] env[67820]: value = "task-3467370" [ 1148.486561] env[67820]: _type = "Task" [ 1148.486561] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1148.493922] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467370, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1148.932230] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1148.932473] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Creating directory with path [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1148.932613] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13b845f1-66f2-498f-a3eb-6714706bffc0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.943725] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Created directory with path [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1148.943934] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Fetch image to [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1148.944119] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1148.944867] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a6315fe-9cb0-491f-92fa-3fed33fdae62 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.952802] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a425f6b-1666-44c5-bb9b-403a1e9ddcd0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.961613] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd3b88e4-d810-406e-b98f-31065bc33822 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1148.995653] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f6710648-bfcf-4b0d-889c-27362c179a89 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.002486] env[67820]: DEBUG oslo_vmware.api [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Task: {'id': task-3467370, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07253} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1149.003900] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1149.004098] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1149.004269] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1149.004436] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Took 0.60 seconds to destroy the instance on the hypervisor. 
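
The entries above capture the full failure arc for instance d5c358d0-46f8-4cba-9e37-34c5dfe92526: the image download to the vmware_temp staging path succeeds, but the follow-up VirtualDiskManager.CopyVirtualDisk_Task (task-3467368) fails with the vCenter fault InvalidArgument ("A specified parameter was not correct: fileType"), after which the VM is unregistered and its datastore contents deleted. Below is a minimal sketch of how oslo.vmware surfaces such a task failure to the caller; it assumes only the oslo.vmware library itself (already a Nova dependency), and classify_task_fault is a hypothetical helper for illustration, not Nova code.

    # Sketch only: oslo.vmware's wait_for_task raises VimFaultException when a
    # vSphere task errors out; its fault_list carries the fault names, which is
    # where the "Faults: ['InvalidArgument']" lines in this log come from.
    from oslo_vmware import exceptions as vexc

    def classify_task_fault(exc):
        """Label a failed vSphere task for retry decisions (illustrative)."""
        if isinstance(exc, vexc.VimFaultException):
            if 'InvalidArgument' in (exc.fault_list or []):
                # A malformed request: retrying the identical CopyVirtualDisk
                # call would fail the same way, so rescheduling the build (as
                # Nova does later in this log) is the only sensible recovery.
                return 'fatal-invalid-argument'
            return 'vim-fault: ' + ', '.join(exc.fault_list or [])
        return 'unknown'

    # Self-contained check, constructing the exception much as oslo.vmware's
    # fault translation would:
    fault = vexc.VimFaultException(
        ['InvalidArgument'], 'A specified parameter was not correct: fileType')
    print(classify_task_fault(fault))  # -> fatal-invalid-argument
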
[ 1149.006220] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-12bc728a-454e-40de-b8d1-5af141d4d061 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.008088] env[67820]: DEBUG nova.compute.claims [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1149.008261] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1149.008465] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.030778] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1149.087881] env[67820]: DEBUG oslo_vmware.rw_handles [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1149.148182] env[67820]: DEBUG oslo_vmware.rw_handles [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1149.148383] env[67820]: DEBUG oslo_vmware.rw_handles [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1149.338544] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5831ef73-437f-4a2d-b2fa-f418dd0e0acc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.346464] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bed38dd6-a202-4396-b922-b39f5dbef38f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.376812] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27609b6d-4279-4ab1-ac85-ed80782fb060 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.383582] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f91e43b-8da9-4a39-a530-473a56459a73 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.396157] env[67820]: DEBUG nova.compute.provider_tree [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1149.404324] env[67820]: DEBUG nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1149.417457] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.409s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.417948] env[67820]: ERROR nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1149.417948] env[67820]: Faults: ['InvalidArgument'] [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Traceback (most recent call last): [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self.driver.spawn(context, instance, image_meta, [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self._fetch_image_if_missing(context, vi) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] image_cache(vi, tmp_image_ds_loc) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] vm_util.copy_virtual_disk( [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] session._wait_for_task(vmdk_copy_task) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return self.wait_for_task(task_ref) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return evt.wait() [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] result = hub.switch() [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] return self.greenlet.switch() [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] self.f(*self.args, **self.kw) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: 
d5c358d0-46f8-4cba-9e37-34c5dfe92526] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] raise exceptions.translate_fault(task_info.error) [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Faults: ['InvalidArgument'] [ 1149.417948] env[67820]: ERROR nova.compute.manager [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] [ 1149.418873] env[67820]: DEBUG nova.compute.utils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1149.419888] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Build of instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 was re-scheduled: A specified parameter was not correct: fileType [ 1149.419888] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1149.420274] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1149.420440] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1149.420610] env[67820]: DEBUG nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1149.420770] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1149.754251] env[67820]: DEBUG nova.network.neutron [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1149.767066] env[67820]: INFO nova.compute.manager [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Took 0.35 seconds to deallocate network for instance. [ 1149.864717] env[67820]: INFO nova.scheduler.client.report [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Deleted allocations for instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 [ 1149.890723] env[67820]: DEBUG oslo_concurrency.lockutils [None req-83f78bf1-7eae-477d-804d-8f372b5f0f58 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 569.548s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.891918] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 368.173s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.892169] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Acquiring lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1149.892374] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.892543] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1149.895071] env[67820]: INFO nova.compute.manager [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Terminating instance [ 1149.896697] env[67820]: DEBUG nova.compute.manager [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1149.896892] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1149.897389] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7663458e-8277-43b6-b4a5-3e6355c04e78 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.908157] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39f830e9-4d62-40ac-ab0e-7c8da08bca1b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1149.919862] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1149.939643] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d5c358d0-46f8-4cba-9e37-34c5dfe92526 could not be found. 
[ 1149.939920] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1149.939985] env[67820]: INFO nova.compute.manager [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1149.940272] env[67820]: DEBUG oslo.service.loopingcall [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1149.940526] env[67820]: DEBUG nova.compute.manager [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1149.940654] env[67820]: DEBUG nova.network.neutron [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1149.963970] env[67820]: DEBUG nova.network.neutron [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1149.972757] env[67820]: INFO nova.compute.manager [-] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] Took 0.03 seconds to deallocate network for instance. 
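
Annotation (not part of the captured log): the "Waiting for function … _deallocate_network_with_retries to return" line above comes from oslo.service's looping-call helper, which repeatedly invokes a function until it signals completion. A minimal sketch of that pattern under stated assumptions; the stand-in body and the one-second interval are illustrative, not Nova's actual retry logic:

    from oslo_service import loopingcall

    def _deallocate_network_with_retries():
        # Stand-in body: on success, stop the loop and hand back a value.
        # A failed attempt would simply return, letting the loop fire again
        # on the next interval.
        raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_with_retries)
    result = timer.start(interval=1).wait()  # blocks until LoopingCallDone
    print(result)  # True
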
[ 1149.975419] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1149.975638] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1149.979025] env[67820]: INFO nova.compute.claims [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1150.080763] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7883cfd6-2a77-4142-ae97-b204a8788173 tempest-ListServersNegativeTestJSON-653087124 tempest-ListServersNegativeTestJSON-653087124-project-member] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.188s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1150.081234] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 10.478s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1150.081433] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d5c358d0-46f8-4cba-9e37-34c5dfe92526] During sync_power_state the instance has a pending task (deleting). Skip. 
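
Annotation (not part of the captured log): the "Claim successful" line above means the resource tracker fitted the new instance into the inventory this node reports to placement, where the schedulable capacity of each resource class is (total - reserved) * allocation_ratio. A quick check against the inventory figures reported earlier in this log (illustrative arithmetic only, not resource-tracker code):

    # Values copied from the set_inventory_for_provider report above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
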
[ 1150.081600] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "d5c358d0-46f8-4cba-9e37-34c5dfe92526" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1150.263638] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27612dc8-7fbc-4c28-885b-c0f338ac12e6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.271416] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-942356f4-b74e-44fc-9d70-45fee5b0f49f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.301034] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d829cc85-d5a3-48f5-97ef-4df2a89c4f32 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.308234] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e8eef5e-e427-4316-b013-0610248896c2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.321245] env[67820]: DEBUG nova.compute.provider_tree [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1150.330488] env[67820]: DEBUG nova.scheduler.client.report [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1150.347632] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1150.348185] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1150.383472] env[67820]: DEBUG nova.compute.utils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1150.385031] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1150.385207] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1150.395729] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1150.450464] env[67820]: DEBUG nova.policy [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3725ebc5bd924d84b0206235ea107e5d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30492c9980ff449ab0f8cda825bece11', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1150.457170] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1150.482523] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1150.482760] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1150.482931] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1150.483139] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1150.483287] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1150.483430] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1150.483633] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1150.483790] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1150.483959] env[67820]: DEBUG 
nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1150.484144] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1150.484320] env[67820]: DEBUG nova.virt.hardware [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1150.485168] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f75db4a7-aa4f-4fb8-875d-22a8fd80740f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.492998] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ee597a1-7725-4b32-930a-e554b5517898 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.738581] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Successfully created port: b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1151.558094] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Successfully updated port: b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1151.574273] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.574273] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquired lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.574273] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1151.612855] env[67820]: DEBUG nova.network.neutron [None 
req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1151.777665] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Updating instance_info_cache with network_info: [{"id": "b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "address": "fa:16:3e:5f:ee:dd", "network": {"id": "e564e9f8-1ab6-4fef-9c21-a4d5c78c7ecd", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-75125556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30492c9980ff449ab0f8cda825bece11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2d85c1e-2f", "ovs_interfaceid": "b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1151.790636] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Releasing lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1151.790961] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance network_info: |[{"id": "b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "address": "fa:16:3e:5f:ee:dd", "network": {"id": "e564e9f8-1ab6-4fef-9c21-a4d5c78c7ecd", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-75125556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30492c9980ff449ab0f8cda825bece11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2d85c1e-2f", "ovs_interfaceid": 
"b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1151.791361] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:ee:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae215ba8-f7a5-4b23-a055-90316d29817f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2d85c1e-2f6e-445f-a8b8-54c6657b660a', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1151.799297] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Creating folder: Project (30492c9980ff449ab0f8cda825bece11). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1151.799837] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fabf189b-80a0-4a6b-aefc-a17c837a3dee {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.812023] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Created folder: Project (30492c9980ff449ab0f8cda825bece11) in parent group-v692668. [ 1151.812023] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Creating folder: Instances. Parent ref: group-v692728. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1151.812155] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd1332d2-7f66-4fa5-982e-8d145b1113cb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.821364] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Created folder: Instances in parent group-v692728. [ 1151.821590] env[67820]: DEBUG oslo.service.loopingcall [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1151.821824] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1151.822037] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-19848b5a-2f3d-4185-b16a-f4337721e812 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.838683] env[67820]: DEBUG nova.compute.manager [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Received event network-vif-plugged-b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1151.838907] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Acquiring lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.839129] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1151.839293] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1151.839462] env[67820]: DEBUG nova.compute.manager [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] No waiting events found dispatching network-vif-plugged-b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1151.839669] env[67820]: WARNING nova.compute.manager [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Received unexpected event network-vif-plugged-b2d85c1e-2f6e-445f-a8b8-54c6657b660a for instance with vm_state building and task_state spawning. [ 1151.839833] env[67820]: DEBUG nova.compute.manager [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Received event network-changed-b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1151.839984] env[67820]: DEBUG nova.compute.manager [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Refreshing instance network info cache due to event network-changed-b2d85c1e-2f6e-445f-a8b8-54c6657b660a. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1151.840179] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Acquiring lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.840312] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Acquired lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1151.840464] env[67820]: DEBUG nova.network.neutron [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Refreshing network info cache for port b2d85c1e-2f6e-445f-a8b8-54c6657b660a {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1151.846363] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1151.846363] env[67820]: value = "task-3467373" [ 1151.846363] env[67820]: _type = "Task" [ 1151.846363] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1151.860439] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467373, 'name': CreateVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1152.104045] env[67820]: DEBUG nova.network.neutron [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Updated VIF entry in instance network info cache for port b2d85c1e-2f6e-445f-a8b8-54c6657b660a. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1152.104045] env[67820]: DEBUG nova.network.neutron [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Updating instance_info_cache with network_info: [{"id": "b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "address": "fa:16:3e:5f:ee:dd", "network": {"id": "e564e9f8-1ab6-4fef-9c21-a4d5c78c7ecd", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-75125556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30492c9980ff449ab0f8cda825bece11", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2d85c1e-2f", "ovs_interfaceid": "b2d85c1e-2f6e-445f-a8b8-54c6657b660a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1152.112059] env[67820]: DEBUG oslo_concurrency.lockutils [req-4639b809-2e6c-4c50-9cb4-4504af7836f1 req-f378f83e-675e-4183-9ecd-443ef8eb79fe service nova] Releasing lock "refresh_cache-9d6e6061-056f-4d2d-9860-22f154edc9ab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1152.357655] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467373, 'name': CreateVM_Task, 'duration_secs': 0.296764} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1152.357845] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1152.358476] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1152.358643] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1152.358966] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1152.359249] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06271031-e09f-4a37-93e5-6375b65c1ac4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.364192] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for the task: (returnval){ [ 1152.364192] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]520c4292-2d72-aab8-f6bb-1a826fb1d29a" [ 1152.364192] env[67820]: _type = "Task" [ 1152.364192] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1152.375750] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]520c4292-2d72-aab8-f6bb-1a826fb1d29a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1152.876123] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1152.876386] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1152.876534] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1156.806655] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.154026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.154026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1180.168735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1180.169047] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
:: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1180.664351] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.621299] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.622136] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1185.621609] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1185.622043] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.616583] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.621319] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1186.621475] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1186.621656] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1186.647556] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.647705] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.647833] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.647959] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648093] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648227] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648343] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648460] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648575] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648691] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1186.648806] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1186.649323] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1188.621501] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1188.634327] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1188.634401] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1188.634547] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1188.634704] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1188.635863] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a88e1f6c-fb96-4464-ad37-ff5b8cff3182 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.645575] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e288bc1c-22a9-4c3f-bd13-8b1403fdf25f tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Acquiring lock "f0934337-b7e8-48ec-b30c-24c92c79267b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1188.645848] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e288bc1c-22a9-4c3f-bd13-8b1403fdf25f tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "f0934337-b7e8-48ec-b30c-24c92c79267b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1188.647492] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a990171a-e4cf-477b-b3af-b403356ea61c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.663221] env[67820]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-635b5866-8f1b-43f1-8075-e8f34a30b679 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.669582] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d4193ce-a4e7-4b4c-97c8-4a84edaf9c95 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.701738] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1188.701827] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1188.702182] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.773427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1188.785727] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.796458] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3873b861-181a-4242-a194-dca1f37f8715 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.808368] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 49a6ccef-af81-4177-93f3-0581c86242c4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.817463] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.832026] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.845885] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.862197] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.874924] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f0934337-b7e8-48ec-b30c-24c92c79267b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1188.874924] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1188.874924] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1189.131578] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae1ef454-a1f2-4586-92a4-b56aa222f1c8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.139419] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7947c438-0ed2-4b0d-a811-ea322c3b6960 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.171655] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b7b2ce-b567-43cb-97cc-2f5004d032ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.179652] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec4ef87c-f4af-4817-a36b-6c518076d595 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.194564] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1189.205970] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1189.220394] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1189.220615] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.518s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1192.221379] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1198.692233] env[67820]: WARNING oslo_vmware.rw_handles [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1198.692233] env[67820]: ERROR oslo_vmware.rw_handles [ 1198.692966] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1198.694621] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1198.694853] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Copying Virtual Disk [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/892b2cd9-c538-4e97-a8f8-2615ef941d1e/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1198.695543] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-22c56949-2887-4213-8393-4e25d7c5d02f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.704251] env[67820]: 
DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for the task: (returnval){ [ 1198.704251] env[67820]: value = "task-3467374" [ 1198.704251] env[67820]: _type = "Task" [ 1198.704251] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1198.712332] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Task: {'id': task-3467374, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1199.215499] env[67820]: DEBUG oslo_vmware.exceptions [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1199.215783] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1199.217053] env[67820]: ERROR nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1199.217053] env[67820]: Faults: ['InvalidArgument'] [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Traceback (most recent call last): [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] yield resources [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self.driver.spawn(context, instance, image_meta, [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self._fetch_image_if_missing(context, vi) [ 1199.217053] env[67820]: ERROR 
nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] image_cache(vi, tmp_image_ds_loc) [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] vm_util.copy_virtual_disk( [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] session._wait_for_task(vmdk_copy_task) [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return self.wait_for_task(task_ref) [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return evt.wait() [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] result = hub.switch() [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return self.greenlet.switch() [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self.f(*self.args, **self.kw) [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] raise exceptions.translate_fault(task_info.error) [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Faults: ['InvalidArgument'] [ 1199.217053] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] [ 1199.217974] env[67820]: INFO nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 
tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Terminating instance [ 1199.218998] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1199.219901] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1199.219901] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1199.220274] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1199.220310] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a5f1e1b3-431e-48fe-a21c-7a5474b6b2a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.223039] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d4e2ce-3a87-44b6-994c-32f110927683 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.229734] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1199.229734] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3a009232-a978-4e05-b342-0685a121611b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.231832] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1199.232016] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1199.232932] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-25da4be5-2ff7-417a-aced-f280dbdc2a89 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.237615] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for the task: (returnval){ [ 1199.237615] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5265c862-8f38-53bf-15a7-732864263de3" [ 1199.237615] env[67820]: _type = "Task" [ 1199.237615] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1199.248378] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5265c862-8f38-53bf-15a7-732864263de3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1199.297602] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1199.297840] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1199.298064] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Deleting the datastore file [datastore1] ffe8063c-5dae-4e58-beca-f3a883d5d8df {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1199.298316] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b7bc661d-a68f-407e-9a9b-4579ba8b58fa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.305178] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for the task: (returnval){ [ 1199.305178] env[67820]: value = "task-3467376" [ 1199.305178] env[67820]: _type = "Task" [ 1199.305178] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1199.312817] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Task: {'id': task-3467376, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1199.749569] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1199.750219] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Creating directory with path [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1199.750604] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b44a4d56-6c55-4c27-a6af-b5f47014e8f9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.762993] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Created directory with path [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1199.763388] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Fetch image to [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1199.763728] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1199.764594] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92544cb0-628a-4c32-bf84-084bdcbc19f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.771445] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc66ab8f-a8f3-4b81-a5fc-79dd27dd4058 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.780273] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-753ab30a-674b-477d-8d83-a049106a9ea2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.813200] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1c52018b-b687-418b-bb8a-748368c6c53d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.820032] env[67820]: DEBUG oslo_vmware.api [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Task: {'id': task-3467376, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068493} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1199.821523] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1199.822863] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1199.823220] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1199.823592] env[67820]: INFO nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Took 0.60 seconds to destroy the instance on the hypervisor. 
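The failed CopyVirtualDisk_Task above ([ 1198.704251] through [ 1199.217053]) shows oslo.vmware's wait-for-task pattern: poll the task object, log its progress, and translate a vCenter fault into a VimFaultException that the compute manager then handles. A minimal illustrative sketch of that loop follows; it is not the oslo.vmware source, and get_task_info is a hypothetical stand-in for the real property-collector read:

    import time

    class VimFaultException(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException; carries the
        # vCenter fault names, e.g. ['InvalidArgument'] for the fileType error.
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        # get_task_info(task_ref) -> dict with 'state', 'faults', 'message';
        # mirrors the "Task: {...} progress is 0%" poll entries above.
        while True:
            info = get_task_info(task_ref)
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # Mirrors "raise exceptions.translate_fault(task_info.error)",
                # e.g. "A specified parameter was not correct: fileType".
                raise VimFaultException(info['faults'], info['message'])
            time.sleep(interval)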
[ 1199.825445] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-74c5473d-7ce4-4e12-832e-e70bae9c91b2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.827474] env[67820]: DEBUG nova.compute.claims [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1199.827756] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.828085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.849501] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1200.034835] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1200.094222] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1200.094423] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1200.167519] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c2a2bf4-f1f8-4909-b77d-e2438a269677 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.175385] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52deeaa1-a403-48ad-8567-37286963eff9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.206206] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df3246af-9c04-4a70-a736-f991a18e7ea0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.214087] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc3c6ea9-a9b8-4a23-9bc5-4236a97b2340 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.227353] env[67820]: DEBUG nova.compute.provider_tree [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1200.240092] env[67820]: DEBUG nova.scheduler.client.report [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1200.252735] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.425s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.253560] env[67820]: ERROR nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.253560] env[67820]: Faults: ['InvalidArgument'] [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Traceback (most recent call last): [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self.driver.spawn(context, instance, image_meta, [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self._fetch_image_if_missing(context, vi) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] image_cache(vi, tmp_image_ds_loc) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] vm_util.copy_virtual_disk( [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] session._wait_for_task(vmdk_copy_task) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return self.wait_for_task(task_ref) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return evt.wait() [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] result = hub.switch() [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] return self.greenlet.switch() [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] self.f(*self.args, **self.kw) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: 
ffe8063c-5dae-4e58-beca-f3a883d5d8df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] raise exceptions.translate_fault(task_info.error) [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Faults: ['InvalidArgument'] [ 1200.253560] env[67820]: ERROR nova.compute.manager [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] [ 1200.254627] env[67820]: DEBUG nova.compute.utils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1200.255760] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Build of instance ffe8063c-5dae-4e58-beca-f3a883d5d8df was re-scheduled: A specified parameter was not correct: fileType [ 1200.255760] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1200.256165] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1200.256350] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1200.256521] env[67820]: DEBUG nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1200.256685] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1200.565922] env[67820]: DEBUG oslo_concurrency.lockutils [None req-012fc784-4067-4078-860c-cc16e44e0cb4 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "4432bc55-aadb-4c3b-8b15-28edfbb40d66" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1200.567396] env[67820]: DEBUG oslo_concurrency.lockutils [None req-012fc784-4067-4078-860c-cc16e44e0cb4 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "4432bc55-aadb-4c3b-8b15-28edfbb40d66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.714619] env[67820]: DEBUG nova.network.neutron [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1200.728960] env[67820]: INFO nova.compute.manager [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Took 0.47 seconds to deallocate network for instance.
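For reference, the "Inventory has not changed" payloads logged above translate into Placement capacity as (total - reserved) * allocation_ratio per resource class. A quick check with the exact values reported for provider 0f792661-ec04-4fc2-898f-e9860339eddd (plain arithmetic, not Nova code):

    # Values copied verbatim from the inventory entries in this log.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        allocatable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, allocatable)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0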
[ 1200.822607] env[67820]: INFO nova.scheduler.client.report [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Deleted allocations for instance ffe8063c-5dae-4e58-beca-f3a883d5d8df [ 1200.849022] env[67820]: DEBUG oslo_concurrency.lockutils [None req-de370e27-aedb-4258-a2fd-eba248577351 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 611.660s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.850226] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 413.475s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.850411] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Acquiring lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1200.850593] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.850760] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.852814] env[67820]: INFO nova.compute.manager [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Terminating instance [ 1200.854662] env[67820]: DEBUG nova.compute.manager [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Start destroying the instance on the hypervisor.
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1200.854754] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1200.855409] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4fe9d595-2b75-4ee5-9070-96c48c41ab69 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.864703] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74b6f87-728c-44ff-b91c-dbebf99be802 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.875739] env[67820]: DEBUG nova.compute.manager [None req-08636822-4130-408a-b62d-184bdcbafa1d tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: df369452-6ff5-4d06-98d3-edf0824a685b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1200.897765] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ffe8063c-5dae-4e58-beca-f3a883d5d8df could not be found. [ 1200.898028] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1200.898156] env[67820]: INFO nova.compute.manager [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1200.898400] env[67820]: DEBUG oslo.service.loopingcall [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1200.898625] env[67820]: DEBUG nova.compute.manager [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1200.898719] env[67820]: DEBUG nova.network.neutron [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1200.905238] env[67820]: DEBUG nova.compute.manager [None req-08636822-4130-408a-b62d-184bdcbafa1d tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: df369452-6ff5-4d06-98d3-edf0824a685b] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1200.922354] env[67820]: DEBUG nova.network.neutron [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1200.925821] env[67820]: DEBUG oslo_concurrency.lockutils [None req-08636822-4130-408a-b62d-184bdcbafa1d tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "df369452-6ff5-4d06-98d3-edf0824a685b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.544s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.929601] env[67820]: INFO nova.compute.manager [-] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] Took 0.03 seconds to deallocate network for instance. [ 1200.935213] env[67820]: DEBUG nova.compute.manager [None req-2d8fd0b7-bed3-4f17-a73b-0e17a572f826 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: 0255953f-2dd2-48fa-9030-9aa96a61c504] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1200.956873] env[67820]: DEBUG nova.compute.manager [None req-2d8fd0b7-bed3-4f17-a73b-0e17a572f826 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: 0255953f-2dd2-48fa-9030-9aa96a61c504] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1200.979468] env[67820]: DEBUG oslo_concurrency.lockutils [None req-2d8fd0b7-bed3-4f17-a73b-0e17a572f826 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "0255953f-2dd2-48fa-9030-9aa96a61c504" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.018s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1200.988739] env[67820]: DEBUG nova.compute.manager [None req-b53a6a26-f764-4101-a709-314d85770721 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: 7048bf62-9134-47f6-9638-e8911bf85e17] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1201.016524] env[67820]: DEBUG nova.compute.manager [None req-b53a6a26-f764-4101-a709-314d85770721 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] [instance: 7048bf62-9134-47f6-9638-e8911bf85e17] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1201.025219] env[67820]: DEBUG oslo_concurrency.lockutils [None req-039ad119-1dc6-408c-9c84-70742a0bb5d4 tempest-ServersV294TestFqdnHostnames-610327644 tempest-ServersV294TestFqdnHostnames-610327644-project-member] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.175s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.026297] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 61.423s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.026488] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: ffe8063c-5dae-4e58-beca-f3a883d5d8df] During sync_power_state the instance has a pending task (deleting). Skip. [ 1201.026650] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "ffe8063c-5dae-4e58-beca-f3a883d5d8df" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.037848] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b53a6a26-f764-4101-a709-314d85770721 tempest-ListServerFiltersTestJSON-177507290 tempest-ListServerFiltersTestJSON-177507290-project-member] Lock "7048bf62-9134-47f6-9638-e8911bf85e17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.458s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.047987] env[67820]: DEBUG nova.compute.manager [None req-f5fbc878-9593-4a76-a007-c3e81a17e59d tempest-ServerActionsTestJSON-1929179909 tempest-ServerActionsTestJSON-1929179909-project-member] [instance: 3349a87f-da82-4990-ad15-7cb0fd446ec7] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1201.072624] env[67820]: DEBUG nova.compute.manager [None req-f5fbc878-9593-4a76-a007-c3e81a17e59d tempest-ServerActionsTestJSON-1929179909 tempest-ServerActionsTestJSON-1929179909-project-member] [instance: 3349a87f-da82-4990-ad15-7cb0fd446ec7] Instance disappeared before build. 
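"Instance disappeared before build." appears for df369452..., 0255953f... and 7048bf62... because each build waited roughly 210-222s on its per-instance lock and, once inside, re-read the instance record and found a delete had won the race. A toy reproduction of that double-check (names are illustrative, not nova's):

    import threading

    _locks, _db = {}, {}            # toy stand-ins for the lock map and the DB

    def locked_do_build(uuid):
        lock = _locks.setdefault(uuid, threading.Lock())
        with lock:                              # may have queued for minutes
            inst = _db.get(uuid)                # re-read state *after* locking
            if inst is None or inst.get("deleted"):
                print(f"[instance: {uuid}] Instance disappeared before build.")
                return
            print(f"[instance: {uuid}] building...")

    _db["df369452"] = {"deleted": True}         # the delete won the race
    locked_do_build("df369452")                 # -> disappeared before build
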
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1201.092478] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f5fbc878-9593-4a76-a007-c3e81a17e59d tempest-ServerActionsTestJSON-1929179909 tempest-ServerActionsTestJSON-1929179909-project-member] Lock "3349a87f-da82-4990-ad15-7cb0fd446ec7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.414s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.101831] env[67820]: DEBUG nova.compute.manager [None req-a23b607f-586a-4a18-b887-6a0c5b2f221d tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: c4413662-f234-4e54-8054-41c655c6412e] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1201.125093] env[67820]: DEBUG nova.compute.manager [None req-a23b607f-586a-4a18-b887-6a0c5b2f221d tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: c4413662-f234-4e54-8054-41c655c6412e] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1201.144732] env[67820]: DEBUG oslo_concurrency.lockutils [None req-a23b607f-586a-4a18-b887-6a0c5b2f221d tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "c4413662-f234-4e54-8054-41c655c6412e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.960s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.155064] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1201.209424] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1201.209697] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1201.211268] env[67820]: INFO nova.compute.claims [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1201.490047] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-012e583a-e15f-4c3f-8e0a-6a4de3658e2d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.497811] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ed537b1-66ea-4f40-ad3b-938ddf3fb69f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.528627] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8855aa4d-d7f3-481f-8546-ae42f267e672 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.536439] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb941b38-d0e9-49ce-9c72-834f4f5c48ce {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.550304] env[67820]: DEBUG nova.compute.provider_tree [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1201.558831] env[67820]: DEBUG nova.scheduler.client.report [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1201.572320] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.572776] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1201.605424] env[67820]: DEBUG nova.compute.utils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1201.606234] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1201.606402] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1201.614563] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1201.679505] env[67820]: DEBUG nova.policy [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6736809e89de49eb8a1956d5eab482b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f0263432284c939246e0af0d7932dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1201.682659] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Start spawning the instance on the hypervisor. 
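The inventory dictionary logged above determines schedulable capacity via the standard placement formula (total - reserved) * allocation_ratio, with max_unit capping any single allocation. For this node: VCPU (48 - 0) * 4.0 = 192, MEMORY_MB (196590 - 512) * 1.0 = 196078, DISK_GB (400 - 0) * 1.0 = 400 (and no single instance may claim more than 94 GB of disk). In code:

    # Placement-style capacity math for the inventory in the log above.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0, "max_unit": 94},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity={capacity:g}, per-allocation cap={inv['max_unit']}")
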
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1201.707174] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1201.707428] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1201.707584] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1201.707763] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1201.707908] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1201.708091] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1201.708316] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1201.708475] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1201.708638] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1201.708796] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1201.708963] env[67820]: DEBUG nova.virt.hardware [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1201.709817] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ba8d10-51fc-4061-b553-c19ff3003ad2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1201.718007] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9afca314-f2b6-4657-93fd-ce8499fdd5dd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.047976] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Successfully created port: 8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1202.700866] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Successfully updated port: 8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1202.713515] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1202.713660] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquired lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1202.713805] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 
tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1202.764854] env[67820]: DEBUG nova.compute.manager [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Received event network-vif-plugged-8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1202.765094] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Acquiring lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.765307] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.765469] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1202.765629] env[67820]: DEBUG nova.compute.manager [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] No waiting events found dispatching network-vif-plugged-8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1202.765787] env[67820]: WARNING nova.compute.manager [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Received unexpected event network-vif-plugged-8950df0d-a7ac-494e-8e04-1e778a2a0692 for instance with vm_state building and task_state spawning. [ 1202.765941] env[67820]: DEBUG nova.compute.manager [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Received event network-changed-8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1202.766311] env[67820]: DEBUG nova.compute.manager [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Refreshing instance network info cache due to event network-changed-8950df0d-a7ac-494e-8e04-1e778a2a0692. 
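The WARNING about the unexpected network-vif-plugged event is a benign ordering race: neutron delivered the event before the spawn path registered a waiter for it, so pop_instance_event found nothing to dispatch to. A minimal latch that mirrors the register/dispatch handshake (illustrative only; nova's real machinery lives in compute/manager.py):

    import threading

    _waiters = {}   # (event_name, tag) -> threading.Event

    def prepare_for_event(name, tag):
        ev = _waiters[(name, tag)] = threading.Event()
        return ev

    def dispatch_event(name, tag):
        ev = _waiters.pop((name, tag), None)
        if ev is None:                  # no one registered yet -> WARNING path
            print(f"Received unexpected event {name}-{tag}")
            return
        ev.set()                        # wake the spawn thread waiting on ev

    dispatch_event("network-vif-plugged", "8950df0d")   # arrives early: warning
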
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1202.766479] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Acquiring lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1202.767317] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1202.924796] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Updating instance_info_cache with network_info: [{"id": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "address": "fa:16:3e:84:b4:43", "network": {"id": "a9abde67-6026-4626-94d3-8faeace735a2", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-992451309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7f0263432284c939246e0af0d7932dc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8950df0d-a7", "ovs_interfaceid": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1202.937853] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Releasing lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1202.938155] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance network_info: |[{"id": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "address": "fa:16:3e:84:b4:43", "network": {"id": "a9abde67-6026-4626-94d3-8faeace735a2", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-992451309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7f0263432284c939246e0af0d7932dc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8950df0d-a7", "ovs_interfaceid": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1202.938443] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Acquired lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1202.938614] env[67820]: DEBUG nova.network.neutron [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Refreshing network info cache for port 8950df0d-a7ac-494e-8e04-1e778a2a0692 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1202.939589] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:84:b4:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ccc0e97b-b21d-4557-a4d4-fd7e8f973368', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8950df0d-a7ac-494e-8e04-1e778a2a0692', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1202.947456] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Creating folder: Project (b7f0263432284c939246e0af0d7932dc). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1202.950153] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-074412b8-9d9c-4019-9b98-b7b4e208b303 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.960638] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Created folder: Project (b7f0263432284c939246e0af0d7932dc) in parent group-v692668. [ 1202.960819] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Creating folder: Instances. Parent ref: group-v692731. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1202.961049] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a6567f8b-7dbd-4a6c-9ff2-06c5fcca9c75 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.969153] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Created folder: Instances in parent group-v692731. [ 1202.969522] env[67820]: DEBUG oslo.service.loopingcall [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1202.969522] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1202.969696] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8bae2508-4bbc-470c-bdd1-bea4b3554a53 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1202.989752] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1202.989752] env[67820]: value = "task-3467379" [ 1202.989752] env[67820]: _type = "Task" [ 1202.989752] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1202.997065] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467379, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1203.420844] env[67820]: DEBUG nova.network.neutron [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Updated VIF entry in instance network info cache for port 8950df0d-a7ac-494e-8e04-1e778a2a0692. 
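A practical aside: the instance_info_cache payloads in these lines are serialized as plain JSON, so they can be pulled out of a log file when debugging network state. A minimal extraction helper (the regex is rough and only targets the "... with network_info:" form seen above):

    import json
    import re

    def network_info_blobs(log_text):
        # grab the [...] list after "network_info:" up to the "{{" suffix
        for m in re.finditer(r"network_info: (\[.*?\])\s*\{\{", log_text, re.S):
            yield json.loads(m.group(1))

    # for info in network_info_blobs(open("nova-compute.log").read()):
    #     print(info[0]["id"], info[0]["network"]["subnets"][0]["ips"])
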
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1203.421234] env[67820]: DEBUG nova.network.neutron [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Updating instance_info_cache with network_info: [{"id": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "address": "fa:16:3e:84:b4:43", "network": {"id": "a9abde67-6026-4626-94d3-8faeace735a2", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-992451309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b7f0263432284c939246e0af0d7932dc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc0e97b-b21d-4557-a4d4-fd7e8f973368", "external-id": "nsx-vlan-transportzone-380", "segmentation_id": 380, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8950df0d-a7", "ovs_interfaceid": "8950df0d-a7ac-494e-8e04-1e778a2a0692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1203.430623] env[67820]: DEBUG oslo_concurrency.lockutils [req-c26a4155-4ea9-4c9d-8840-0a686240084a req-a18e6771-99d6-4dd9-9f8e-d64adc2de270 service nova] Releasing lock "refresh_cache-3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1203.499655] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467379, 'name': CreateVM_Task, 'duration_secs': 0.278865} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1203.499814] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1203.506350] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1203.506519] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1203.506833] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1203.507126] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-824543fa-07ce-44b7-ba82-f306075b843f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1203.511248] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for the task: (returnval){ [ 1203.511248] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52b510e9-add5-31b7-94e4-396ca4af7bb6" [ 1203.511248] env[67820]: _type = "Task" [ 1203.511248] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1203.521748] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52b510e9-add5-31b7-94e4-396ca4af7bb6, 'name': SearchDatastore_Task} progress is 0%. 
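The CreateVM_Task and SearchDatastore_Task lines show the oslo.vmware polling pattern: submit a VI SDK task, then re-read its state on a timer until it reaches a terminal state (the "progress is 0%." lines are those polls; CreateVM_Task above completed in 0.278865s). A generic sketch of such a loop; the states mirror vSphere task semantics, not oslo's exact code:

    import time

    def wait_for_task(read_task_info, poll_interval=0.5):
        while True:
            info = read_task_info()        # one RetrievePropertiesEx round-trip
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise RuntimeError(info.get("error", "task failed"))
            print(f"Task {info['key']} progress is {info.get('progress', 0)}%.")
            time.sleep(poll_interval)
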
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1204.021916] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1204.022162] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1204.022386] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1214.983967] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1241.621196] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1242.039548] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1242.039858] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.616619] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.621343] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.621692] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.621967] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1247.618358] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.621048] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.621152] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1248.082984] env[67820]: WARNING oslo_vmware.rw_handles [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1248.082984] env[67820]: ERROR oslo_vmware.rw_handles [ 1248.083532] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to 
vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1248.086694] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1248.086932] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Copying Virtual Disk [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/55de0090-bbc0-408b-a0ae-32559cc51c0f/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1248.087220] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1978b44a-4665-4e08-ae57-d7a7832ca6fb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.095720] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for the task: (returnval){ [ 1248.095720] env[67820]: value = "task-3467380" [ 1248.095720] env[67820]: _type = "Task" [ 1248.095720] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.103817] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Task: {'id': task-3467380, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.605945] env[67820]: DEBUG oslo_vmware.exceptions [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Fault InvalidArgument not matched. 
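"Fault InvalidArgument not matched." records the fault-translation step: oslo.vmware looks the VIM fault name up in a registry of specific exception classes and, finding no entry, falls back to a generic VimFaultException carrying the fault list, which is what then fails the spawn below. A sketch of that lookup-with-fallback (the registry contents here are illustrative):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    _FAULT_CLASSES = {
        "FileAlreadyExists": FileExistsError,   # stand-in for a specific class
    }

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:                 # "Fault InvalidArgument not matched."
            return VimFaultException([fault_name], message)
        return cls(message)

    exc = translate_fault("InvalidArgument",
                          "A specified parameter was not correct: fileType")
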
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1248.606250] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1248.606816] env[67820]: ERROR nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1248.606816] env[67820]: Faults: ['InvalidArgument'] [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Traceback (most recent call last): [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] yield resources [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self.driver.spawn(context, instance, image_meta, [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self._fetch_image_if_missing(context, vi) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] image_cache(vi, tmp_image_ds_loc) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] vm_util.copy_virtual_disk( [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] session._wait_for_task(vmdk_copy_task) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return self.wait_for_task(task_ref) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return evt.wait() [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] result = hub.switch() [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return self.greenlet.switch() [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self.f(*self.args, **self.kw) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] raise exceptions.translate_fault(task_info.error) [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Faults: ['InvalidArgument'] [ 1248.606816] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] [ 1248.607806] env[67820]: INFO nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Terminating instance [ 1248.608682] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1248.608908] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1248.609171] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-fd2c7776-57bb-40e0-8bf9-1055c32fa8d9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.612032] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1248.612233] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1248.612982] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46bc753f-d278-4037-824a-68f62c1ad99d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.619400] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1248.619696] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7582060e-5480-41e5-8a1c-32151c3edc3b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.621849] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1248.621999] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1248.622130] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1248.623910] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1248.624218] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1248.625176] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb7a854d-8269-4a86-a6fb-c8bab2416f13 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.630047] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for the task: (returnval){ [ 1248.630047] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]521bdfa5-abcf-c494-32de-4e100e6a5fce" [ 1248.630047] env[67820]: _type = "Task" [ 1248.630047] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.637013] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]521bdfa5-abcf-c494-32de-4e100e6a5fce, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.643618] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.643767] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.643896] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644141] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644316] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644444] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644565] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644685] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644810] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.644917] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1248.645045] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1248.645525] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1248.656790] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.657109] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.657360] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1248.657594] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1248.658756] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c10fbda4-a17d-42c9-bbc9-51caaaf8d0f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.666447] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0dccb4a-2e97-4009-9cfa-19ee111e103d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.682247] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46f2a3ea-e7be-4777-867e-3132d04f7da4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.689119] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89016037-f6a1-4e37-85da-22a36ded1830 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.694323] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1248.694536] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1248.694709] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Deleting the datastore file [datastore1] 39420619-61c5-4e52-8236-d3abc3ef6f0f {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1248.694929] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9b233f29-539a-4008-9019-57c4b7e5166a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.720969] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180904MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1248.721995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.721995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.726672] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for the task: (returnval){ [ 1248.726672] env[67820]: value = "task-3467382" [ 1248.726672] env[67820]: _type = "Task" [ 1248.726672] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.733876] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Task: {'id': task-3467382, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.795248] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.795468] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 906c40dd-b6d6-492a-aa51-58901959a60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.795550] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.795674] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.795794] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.795914] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.796043] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.796161] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.796273] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.796384] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1248.808533] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.819072] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.829395] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.841059] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.850935] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f0934337-b7e8-48ec-b30c-24c92c79267b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.866266] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4432bc55-aadb-4c3b-8b15-28edfbb40d66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.876669] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1248.876669] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1248.876669] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1249.098157] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-014a996f-f6f6-4c7d-8ddc-31f9e76ca77e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.105428] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5043fe8-effe-4130-b687-0d558ba1aa91 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.138548] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a852b991-83cd-4e35-8185-fa91ff08efe2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.148589] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49cc2113-d932-403e-bbb9-5a74a04794b3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.152206] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 
tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1249.152482] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Creating directory with path [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1249.152703] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-07330cf8-dfb4-47d6-bad5-5e5efdc52163 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.163641] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1249.166248] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Created directory with path [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1249.166440] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Fetch image to [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1249.166606] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1249.167306] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df39fc46-50f3-4580-bcc0-dc202bd0d973 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.174338] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1249.178618] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45f093c-aa32-432c-aac5-759ce117c7d1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.188249] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1249.188425] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1249.190069] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0f1ef7f-8e23-48ab-b12d-e8ebf20b5c48 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.221531] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f46cbe3-0fa9-4fae-a170-12f873dfa9b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.229988] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-22c9c6b4-e643-47b3-aee9-a27cb13e349b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.236716] env[67820]: DEBUG oslo_vmware.api [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Task: {'id': task-3467382, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076037} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1249.236982] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1249.237230] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1249.237527] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1249.237746] env[67820]: INFO nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1249.240072] env[67820]: DEBUG nova.compute.claims [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1249.240214] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1249.240461] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1249.251157] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1249.465899] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1249.525410] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1249.525648] env[67820]: DEBUG oslo_vmware.rw_handles [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1249.533339] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ffa52f7-4b99-4693-b421-2a36782f6160 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.541220] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd994448-884e-4ebb-b3e0-f299b070192c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.570573] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9767069f-05c3-4370-bf5e-bbbfee33140a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.577405] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59690be8-4aaf-405d-86d2-14a1996b1413 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.590277] env[67820]: DEBUG nova.compute.provider_tree [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1249.598424] env[67820]: DEBUG nova.scheduler.client.report [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1249.613662] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 
tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.373s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1249.614197] env[67820]: ERROR nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1249.614197] env[67820]: Faults: ['InvalidArgument'] [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Traceback (most recent call last): [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self.driver.spawn(context, instance, image_meta, [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self._fetch_image_if_missing(context, vi) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] image_cache(vi, tmp_image_ds_loc) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] vm_util.copy_virtual_disk( [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] session._wait_for_task(vmdk_copy_task) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return self.wait_for_task(task_ref) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return evt.wait() 
[ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] result = hub.switch() [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] return self.greenlet.switch() [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] self.f(*self.args, **self.kw) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] raise exceptions.translate_fault(task_info.error) [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Faults: ['InvalidArgument'] [ 1249.614197] env[67820]: ERROR nova.compute.manager [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] [ 1249.615071] env[67820]: DEBUG nova.compute.utils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1249.616573] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Build of instance 39420619-61c5-4e52-8236-d3abc3ef6f0f was re-scheduled: A specified parameter was not correct: fileType [ 1249.616573] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1249.616934] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1249.617118] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1249.617288] env[67820]: DEBUG nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1249.617448] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1249.991490] env[67820]: DEBUG nova.network.neutron [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1250.006436] env[67820]: INFO nova.compute.manager [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Took 0.39 seconds to deallocate network for instance. [ 1250.093622] env[67820]: INFO nova.scheduler.client.report [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Deleted allocations for instance 39420619-61c5-4e52-8236-d3abc3ef6f0f [ 1250.117301] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5da925df-1900-41e1-844e-810ed282d371 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 657.221s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.118361] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 458.383s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1250.118570] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Acquiring lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1250.118782] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events"
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1250.118945] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.121214] env[67820]: INFO nova.compute.manager [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Terminating instance [ 1250.122877] env[67820]: DEBUG nova.compute.manager [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1250.123077] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1250.123594] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8f35466f-b8d8-4b42-80d0-5b952d7b26e3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.133842] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-632ba366-b0c0-423c-87d2-734a3f12bf15 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.149979] env[67820]: DEBUG nova.compute.manager [None req-dd193638-d486-4a83-a771-da2fb987578f tempest-ServerGroupTestJSON-1578419083 tempest-ServerGroupTestJSON-1578419083-project-member] [instance: 3873b861-181a-4242-a194-dca1f37f8715] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1250.162978] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 39420619-61c5-4e52-8236-d3abc3ef6f0f could not be found.
[ 1250.164044] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1250.164044] env[67820]: INFO nova.compute.manager [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1250.164044] env[67820]: DEBUG oslo.service.loopingcall [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1250.164044] env[67820]: DEBUG nova.compute.manager [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1250.164044] env[67820]: DEBUG nova.network.neutron [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1250.178650] env[67820]: DEBUG nova.compute.manager [None req-dd193638-d486-4a83-a771-da2fb987578f tempest-ServerGroupTestJSON-1578419083 tempest-ServerGroupTestJSON-1578419083-project-member] [instance: 3873b861-181a-4242-a194-dca1f37f8715] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1250.206221] env[67820]: DEBUG nova.network.neutron [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1250.212150] env[67820]: DEBUG oslo_concurrency.lockutils [None req-dd193638-d486-4a83-a771-da2fb987578f tempest-ServerGroupTestJSON-1578419083 tempest-ServerGroupTestJSON-1578419083-project-member] Lock "3873b861-181a-4242-a194-dca1f37f8715" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.406s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.214598] env[67820]: INFO nova.compute.manager [-] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] Took 0.05 seconds to deallocate network for instance. [ 1250.243070] env[67820]: DEBUG nova.compute.manager [None req-0335e866-f4cc-4f75-9126-2eb6970d1b83 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 49a6ccef-af81-4177-93f3-0581c86242c4] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1250.267905] env[67820]: DEBUG nova.compute.manager [None req-0335e866-f4cc-4f75-9126-2eb6970d1b83 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 49a6ccef-af81-4177-93f3-0581c86242c4] Instance disappeared before build.
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1250.287834] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0335e866-f4cc-4f75-9126-2eb6970d1b83 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "49a6ccef-af81-4177-93f3-0581c86242c4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.363s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.300341] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1250.310328] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ff291f86-c623-4640-aab3-a9ee4b867107 tempest-VolumesAssistedSnapshotsTest-178887626 tempest-VolumesAssistedSnapshotsTest-178887626-project-member] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.192s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.311212] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 110.708s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1250.311386] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 39420619-61c5-4e52-8236-d3abc3ef6f0f] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1250.311588] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "39420619-61c5-4e52-8236-d3abc3ef6f0f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.350893] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1250.351169] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1250.353119] env[67820]: INFO nova.compute.claims [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1250.565759] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c53df56-9e6b-41a0-80f5-9fc056db70d2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.573322] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71b4eb36-477f-4bbc-8d13-a0f3440923cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.602516] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63459d16-4edf-415d-bab3-2aa087acb1aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.610020] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5b20ba-3975-416a-8b43-1261c7a7b72f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.622490] env[67820]: DEBUG nova.compute.provider_tree [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1250.630993] env[67820]: DEBUG nova.scheduler.client.report [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit':
65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1250.644070] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1250.644585] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1250.676339] env[67820]: DEBUG nova.compute.utils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1250.678088] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1250.678330] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1250.690233] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Start building block device mappings for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1250.739968] env[67820]: DEBUG nova.policy [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8516a08bf3fe45eaaf6172f738b30a6d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e46abdc851cc45b5bdf0b207237673c7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1250.752623] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1250.776887] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1250.777134] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1250.777291] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1250.777505] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1250.777691] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1250.777842] 
env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1250.778058] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1250.778219] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1250.778384] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1250.778547] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1250.778715] env[67820]: DEBUG nova.virt.hardware [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1250.779578] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49daf5ec-1e58-4d8f-8188-4a182ad1c4b1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1250.787687] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15e09588-6d8e-4502-8580-63a987083f6f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1251.112670] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Successfully created port: 0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1251.787493] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Successfully updated port: 0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1251.799666] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1251.799899] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquired lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1251.800139] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1251.842185] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1251.996755] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Updating instance_info_cache with network_info: [{"id": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "address": "fa:16:3e:6e:20:ac", "network": {"id": "21b65d05-199a-4b06-a941-b2306a1ef5f1", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-437366391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e46abdc851cc45b5bdf0b207237673c7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0aa1e58b-24", "ovs_interfaceid": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1252.010410] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Releasing lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1252.010700] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb 
tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance network_info: |[{"id": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "address": "fa:16:3e:6e:20:ac", "network": {"id": "21b65d05-199a-4b06-a941-b2306a1ef5f1", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-437366391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e46abdc851cc45b5bdf0b207237673c7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0aa1e58b-24", "ovs_interfaceid": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1252.011105] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:20:ac', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1fb81f98-6f5a-47ab-a512-27277591d064', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0aa1e58b-242b-4f74-bd33-ca376a9aaa4a', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1252.019033] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Creating folder: Project (e46abdc851cc45b5bdf0b207237673c7). Parent ref: group-v692668. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1252.019448] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a4468214-07d5-40f6-9c4c-c464590b13e0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.028055] env[67820]: DEBUG nova.compute.manager [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Received event network-vif-plugged-0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1252.028311] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Acquiring lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1252.028524] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1252.028693] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1252.028853] env[67820]: DEBUG nova.compute.manager [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] No waiting events found dispatching network-vif-plugged-0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1252.029016] env[67820]: WARNING nova.compute.manager [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Received unexpected event network-vif-plugged-0aa1e58b-242b-4f74-bd33-ca376a9aaa4a for instance with vm_state building and task_state spawning. [ 1252.029177] env[67820]: DEBUG nova.compute.manager [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Received event network-changed-0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1252.029328] env[67820]: DEBUG nova.compute.manager [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Refreshing instance network info cache due to event network-changed-0aa1e58b-242b-4f74-bd33-ca376a9aaa4a.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1252.029527] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Acquiring lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1252.029681] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Acquired lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1252.029838] env[67820]: DEBUG nova.network.neutron [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Refreshing network info cache for port 0aa1e58b-242b-4f74-bd33-ca376a9aaa4a {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1252.033141] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Created folder: Project (e46abdc851cc45b5bdf0b207237673c7) in parent group-v692668. [ 1252.033350] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Creating folder: Instances. Parent ref: group-v692734. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1252.033587] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a75319fc-2eb5-4cf2-8953-e07f53f85e8a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.043843] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Created folder: Instances in parent group-v692734. [ 1252.044083] env[67820]: DEBUG oslo.service.loopingcall [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1252.044540] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1252.044745] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed8a5a4c-7f56-4f74-b340-910a64239bdb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.064575] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1252.064575] env[67820]: value = "task-3467385" [ 1252.064575] env[67820]: _type = "Task" [ 1252.064575] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.071809] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467385, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1252.170944] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1252.336986] env[67820]: DEBUG nova.network.neutron [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Updated VIF entry in instance network info cache for port 0aa1e58b-242b-4f74-bd33-ca376a9aaa4a. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1252.337426] env[67820]: DEBUG nova.network.neutron [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Updating instance_info_cache with network_info: [{"id": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "address": "fa:16:3e:6e:20:ac", "network": {"id": "21b65d05-199a-4b06-a941-b2306a1ef5f1", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-437366391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e46abdc851cc45b5bdf0b207237673c7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0aa1e58b-24", "ovs_interfaceid": "0aa1e58b-242b-4f74-bd33-ca376a9aaa4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1252.347756] env[67820]: DEBUG oslo_concurrency.lockutils [req-5295e00e-2f81-4f64-ab41-02dc85bdb78d req-7da0a3f8-fa5d-40d4-a6de-ca7cae273339 service nova] Releasing lock "refresh_cache-f4d41e35-6408-4cd8-a7e0-52b030e56b40" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1252.574087] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467385, 'name': CreateVM_Task, 'duration_secs': 0.356398} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1252.574256] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1252.574982] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1252.575155] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1252.575466] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1252.575719] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe9bb630-aa8d-4a7e-aeb1-4469f354bcf6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1252.579875] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for the task: (returnval){ [ 1252.579875] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]520d2f5a-5640-ede7-12b0-443820df6b8b" [ 1252.579875] env[67820]: _type = "Task" [ 1252.579875] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1252.587126] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]520d2f5a-5640-ede7-12b0-443820df6b8b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1253.090233] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1253.090495] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1253.090730] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1258.841081] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1258.841343] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1264.682304] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1298.100080] env[67820]: WARNING oslo_vmware.rw_handles [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in
getresponse [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1298.100080] env[67820]: ERROR oslo_vmware.rw_handles [ 1298.100706] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1298.102813] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1298.103073] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Copying Virtual Disk [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/d1ac726c-3039-456a-a059-a07315127125/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1298.103382] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-392b3b20-d851-45a3-9a6d-b563a312ddcf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.111103] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for the task: (returnval){ [ 1298.111103] env[67820]: value = "task-3467386" [ 1298.111103] env[67820]: _type = "Task" [ 1298.111103] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1298.119548] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Task: {'id': task-3467386, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1298.622085] env[67820]: DEBUG oslo_vmware.exceptions [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1298.622085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1298.622440] env[67820]: ERROR nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1298.622440] env[67820]: Faults: ['InvalidArgument'] [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Traceback (most recent call last): [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] yield resources [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self.driver.spawn(context, instance, image_meta, [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self._fetch_image_if_missing(context, vi) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] image_cache(vi, tmp_image_ds_loc) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] vm_util.copy_virtual_disk( [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] session._wait_for_task(vmdk_copy_task) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return self.wait_for_task(task_ref) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return evt.wait() [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] result = hub.switch() [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return self.greenlet.switch() [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self.f(*self.args, **self.kw) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] raise exceptions.translate_fault(task_info.error) [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Faults: ['InvalidArgument'] [ 1298.622440] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] [ 1298.624672] env[67820]: INFO nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Terminating instance [ 1298.624672] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1298.624672] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1298.624672] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f7e432d3-9851-4a66-bb1f-209a0fce0861 {{(pid=67820) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.626704] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1298.626893] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1298.627622] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfb98b73-9820-4e36-b6c5-9472c205f6f2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.634334] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1298.634523] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-003db488-6b0a-4f28-a2c2-3e19d63fc167 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.636675] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1298.636856] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1298.637923] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc41956f-fc25-446f-acca-4300681fb74f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.643150] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){ [ 1298.643150] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52574714-4fb0-9634-268a-49ebf1c72f9f" [ 1298.643150] env[67820]: _type = "Task" [ 1298.643150] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1298.650065] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52574714-4fb0-9634-268a-49ebf1c72f9f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1298.703343] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1298.703632] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1298.703747] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Deleting the datastore file [datastore1] 906c40dd-b6d6-492a-aa51-58901959a60d {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1298.703999] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2db64fd7-47a2-4d20-a939-80fa6ee6f435 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.710160] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for the task: (returnval){ [ 1298.710160] env[67820]: value = "task-3467388" [ 1298.710160] env[67820]: _type = "Task" [ 1298.710160] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1298.717488] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Task: {'id': task-3467388, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1299.154840] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1299.155255] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Creating directory with path [datastore1] vmware_temp/3cc93749-43ee-4b33-b06e-bf3510f1b368/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1299.155331] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11104d63-5287-4c52-bfd0-a2ef604b496b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.166725] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Created directory with path [datastore1] vmware_temp/3cc93749-43ee-4b33-b06e-bf3510f1b368/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1299.166725] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Fetch image to [datastore1] vmware_temp/3cc93749-43ee-4b33-b06e-bf3510f1b368/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1299.167030] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/3cc93749-43ee-4b33-b06e-bf3510f1b368/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1299.167660] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4be6baa1-04b9-44d3-8a1c-6442b0e52d58 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.174276] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d8db474-1466-477b-b073-9397c80d30b0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.183341] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93839c40-b533-4753-8ae4-8223c9bf7bfc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.218056] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-093fd6c7-5b02-41c2-9f8a-eb16aae56787 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.224116] env[67820]: DEBUG oslo_vmware.api [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Task: {'id': task-3467388, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074775} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1299.225578] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1299.225770] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1299.225946] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1299.226133] env[67820]: INFO nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Took 0.60 seconds to destroy the instance on the hypervisor. 
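
Every datastore and VM operation in the records above (CreateVM_Task, CopyVirtualDisk_Task, SearchDatastore_Task, DeleteDatastoreFile_Task) follows the same asynchronous pattern: the vCenter call returns a task handle, and oslo.vmware's wait_for_task/_poll_task loop polls it until the task reports success (logged as "completed successfully" with a duration_secs) or a fault (translated and raised, as with the InvalidArgument fault earlier). A minimal sketch of that control flow in Python, with a hypothetical get_task_info callable standing in for the real PropertyCollector round-trip:

    import time

    POLL_INTERVAL = 0.5  # assumed; the real interval is driven by a looping call

    class TaskFault(Exception):
        """Stand-in for the translated oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(task_ref, get_task_info):
        """Poll a vSphere task until it finishes.

        get_task_info is a hypothetical callable returning an object with
        .state ('running', 'success' or 'error'), .progress and .error.
        """
        while True:
            info = get_task_info(task_ref)   # a PropertyCollector call in the real code
            if info.state == 'success':
                return info                  # the "completed successfully" records above
            if info.state == 'error':
                raise TaskFault(info.error)  # cf. "Fault InvalidArgument not matched."
            # still running: this is where the "progress is N%." records come from
            time.sleep(POLL_INTERVAL)
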
[ 1299.228412] env[67820]: DEBUG nova.compute.claims [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1299.228583] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1299.228792] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1299.231370] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3b74dc08-e9f5-4473-bd4e-d7d31029fa50 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.254416] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1299.405086] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1299.406719] env[67820]: ERROR nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. 
[ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last): [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] result = getattr(controller, method)(*args, **kwargs) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._get(image_id) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] resp, body = self.http_client.get(url, headers=header) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self.request(url, 'GET', **kwargs) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._handle_response(resp) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise exc.from_response(resp, resp.content) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred: [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last): [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] yield resources [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self.driver.spawn(context, instance, image_meta, [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self._fetch_image_if_missing(context, vi) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image_fetch(context, vi, tmp_image_ds_loc) [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] images.fetch_image( [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1299.406719] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] metadata = IMAGE_API.get(context, image_ref) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return session.show(context, image_id, [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] _reraise_translated_image_exception(image_id) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise new_exc.with_traceback(exc_trace) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] result = getattr(controller, method)(*args, **kwargs) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._get(image_id) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] resp, body = self.http_client.get(url, headers=header) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self.request(url, 'GET', **kwargs) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._handle_response(resp) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise exc.from_response(resp, resp.content) [ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. 
[ 1299.408064] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1299.408064] env[67820]: INFO nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Terminating instance [ 1299.409175] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1299.409394] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1299.412063] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1299.412263] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1299.412518] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5dfb7538-40cb-491a-871e-97a47988d809 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.415452] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83680e71-34cd-47eb-8d34-361d09f6a146 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.422791] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1299.423029] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d19646f3-00ac-401c-8283-da9db2db8601 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.425626] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1299.425807] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 
tempest-DeleteServersTestJSON-1163079554-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1299.428822] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9dc1322d-c591-4e05-b97d-81efa50c67f5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.433847] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1299.433847] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5282c109-b78e-0ddb-6fc6-4e119ecac1f4" [ 1299.433847] env[67820]: _type = "Task" [ 1299.433847] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1299.441659] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5282c109-b78e-0ddb-6fc6-4e119ecac1f4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1299.491103] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1299.491277] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1299.491449] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleting the datastore file [datastore1] 2311d6b7-32ab-45c6-83f8-9b341e847bf0 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1299.491697] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9e478cb0-c05b-46c0-a058-75677774c9a4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.497661] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for the task: (returnval){ [ 1299.497661] env[67820]: value = "task-3467390" [ 1299.497661] env[67820]: _type = "Task" [ 1299.497661] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1299.502116] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bc0e57-def4-4dfc-93cc-1b642d020fb4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.509403] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Task: {'id': task-3467390, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1299.512220] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45e1d5ab-79a8-47a4-bf92-ecd8f53cf9fb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.541209] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a44c8e55-4fda-40d3-bba1-125f03d028c8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.548063] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4997446d-e771-4499-82a7-da73dbda9c1c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.561802] env[67820]: DEBUG nova.compute.provider_tree [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1299.570841] env[67820]: DEBUG nova.scheduler.client.report [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1299.584995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1299.585507] env[67820]: ERROR nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1299.585507] env[67820]: Faults: 
['InvalidArgument'] [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Traceback (most recent call last): [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self.driver.spawn(context, instance, image_meta, [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self._fetch_image_if_missing(context, vi) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] image_cache(vi, tmp_image_ds_loc) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] vm_util.copy_virtual_disk( [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] session._wait_for_task(vmdk_copy_task) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return self.wait_for_task(task_ref) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return evt.wait() [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] result = hub.switch() [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] return self.greenlet.switch() [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 
906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] self.f(*self.args, **self.kw) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] raise exceptions.translate_fault(task_info.error) [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Faults: ['InvalidArgument'] [ 1299.585507] env[67820]: ERROR nova.compute.manager [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] [ 1299.586528] env[67820]: DEBUG nova.compute.utils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1299.587900] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Build of instance 906c40dd-b6d6-492a-aa51-58901959a60d was re-scheduled: A specified parameter was not correct: fileType [ 1299.587900] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1299.588292] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1299.588464] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1299.588629] env[67820]: DEBUG nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1299.588787] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1299.886458] env[67820]: DEBUG nova.network.neutron [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1299.897848] env[67820]: INFO nova.compute.manager [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Took 0.31 seconds to deallocate network for instance. [ 1299.944043] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1299.944728] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1299.944728] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b23cd15e-2214-4503-a2ce-18dd4e23e45e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1299.983218] env[67820]: INFO nova.scheduler.client.report [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Deleted allocations for instance 906c40dd-b6d6-492a-aa51-58901959a60d [ 1300.003431] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1300.004217] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Fetch image to [datastore1] 
vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1300.004217] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1300.005537] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e158bb-1299-4493-9ead-bc0cb89e39b5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.008520] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1d67a9f-d677-4977-af86-95580fe2718f tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 677.871s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.012683] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 479.126s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.012909] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Acquiring lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.013131] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.013331] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.016179] env[67820]: DEBUG oslo_vmware.api [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] 
Task: {'id': task-3467390, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072032} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1300.016179] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1300.016179] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1300.016179] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1300.016605] env[67820]: INFO nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1300.018037] env[67820]: INFO nova.compute.manager [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Terminating instance [ 1300.022032] env[67820]: DEBUG nova.compute.manager [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1300.022227] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1300.023270] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2249345f-21a8-41f8-9d1d-d8c272000d99 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.026130] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ad643c03-6e57-4b5a-b141-dda1908cb895 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.028420] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1300.039256] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f602789-b453-4545-af5f-f6716c7124be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.046100] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e15a3e-aaca-4503-b084-eba9e43a8889 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.058157] env[67820]: DEBUG nova.compute.claims [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1300.058329] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.058541] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.097469] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8accd6-bcb0-4753-b3d2-2803c033881c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.100708] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 906c40dd-b6d6-492a-aa51-58901959a60d could not be found. [ 1300.100912] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1300.101126] env[67820]: INFO nova.compute.manager [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Took 0.08 seconds to destroy the instance on the hypervisor. [ 1300.101335] env[67820]: DEBUG oslo.service.loopingcall [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1300.105766] env[67820]: DEBUG nova.compute.manager [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1300.105865] env[67820]: DEBUG nova.network.neutron [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1300.112436] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8819477d-cfbd-4a42-b297-1756cf478be6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.118872] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1300.133871] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1300.139379] env[67820]: DEBUG nova.network.neutron [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1300.148376] env[67820]: INFO nova.compute.manager [-] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] Took 0.04 seconds to deallocate network for instance. [ 1300.194043] env[67820]: DEBUG oslo_vmware.rw_handles [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1300.260618] env[67820]: DEBUG oslo_vmware.rw_handles [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1300.260822] env[67820]: DEBUG oslo_vmware.rw_handles [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1300.300975] env[67820]: DEBUG oslo_concurrency.lockutils [None req-243c65e5-8fb8-49e9-8d99-87a930591502 tempest-ImagesNegativeTestJSON-1053268250 tempest-ImagesNegativeTestJSON-1053268250-project-member] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.288s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.301844] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 160.698s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.302046] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 906c40dd-b6d6-492a-aa51-58901959a60d] During sync_power_state the instance has a pending task (deleting). Skip. [ 1300.302223] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "906c40dd-b6d6-492a-aa51-58901959a60d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.392406] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7038177e-824f-4cc7-9644-e8137c8c0650 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.400517] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de9a30e7-da01-4b71-9a4f-747157d6b1ec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.432045] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad086f1-8033-4482-a6f2-6c0d6833d83f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.439809] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac881eb-5db2-4e85-b7c8-e50a0a89cfa8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1300.454076] env[67820]: DEBUG nova.compute.provider_tree [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1300.463031] env[67820]: DEBUG nova.scheduler.client.report [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1300.475415] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.417s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1300.476154] env[67820]: ERROR nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last): [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] result = getattr(controller, method)(*args, **kwargs) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._get(image_id) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] resp, body = self.http_client.get(url, headers=header) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self.request(url, 'GET', **kwargs) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 
2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._handle_response(resp) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise exc.from_response(resp, resp.content) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred: [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last): [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self.driver.spawn(context, instance, image_meta, [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] self._fetch_image_if_missing(context, vi) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image_fetch(context, vi, tmp_image_ds_loc) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] images.fetch_image( [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] metadata = IMAGE_API.get(context, image_ref) [ 1300.476154] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1300.476154] env[67820]: ERROR nova.compute.manager 
[instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return session.show(context, image_id, [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] _reraise_translated_image_exception(image_id) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise new_exc.with_traceback(exc_trace) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] result = getattr(controller, method)(*args, **kwargs) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._get(image_id) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] resp, body = self.http_client.get(url, headers=header) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self.request(url, 'GET', **kwargs) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] return self._handle_response(resp) [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] raise exc.from_response(resp, resp.content) [ 1300.477279] env[67820]: ERROR 
nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. [ 1300.477279] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] [ 1300.477279] env[67820]: DEBUG nova.compute.utils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1300.478060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.359s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1300.479175] env[67820]: INFO nova.compute.claims [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1300.484054] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Build of instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 was re-scheduled: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1300.484054] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1300.484054] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
[ 1300.484054] env[67820]: DEBUG nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1300.484054] env[67820]: DEBUG nova.network.neutron [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1300.601757] env[67820]: DEBUG neutronclient.v2_0.client [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67820) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1300.604170] env[67820]: ERROR nova.compute.manager [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     result = getattr(controller, method)(*args, **kwargs)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self._get(image_id)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     resp, body = self.http_client.get(url, headers=header)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.request(url, 'GET', **kwargs)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self._handle_response(resp)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise exc.from_response(resp, resp.content)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred:
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self.driver.spawn(context, instance, image_meta,
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._fetch_image_if_missing(context, vi)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     images.fetch_image(
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     metadata = IMAGE_API.get(context, image_ref)
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1300.604170] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return session.show(context, image_id,
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     _reraise_translated_image_exception(image_id)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise new_exc.with_traceback(exc_trace)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     result = getattr(controller, method)(*args, **kwargs)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self._get(image_id)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     resp, body = self.http_client.get(url, headers=header)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.request(url, 'GET', **kwargs)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self._handle_response(resp)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise exc.from_response(resp, resp.content)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab.
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred:
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._build_and_run_instance(context, instance, image,
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise exception.RescheduledException(
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.RescheduledException: Build of instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 was re-scheduled: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab.
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred:
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     exception_handler_v20(status_code, error_body)
[ 1300.605475] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise client_exc(message=error_message,
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Neutron server returns request_ids: ['req-2df46fc7-2186-468e-a9ed-06201c4bd5f7']
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred:
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._deallocate_network(context, instance, requested_networks)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self.network_api.deallocate_for_instance(
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     data = neutron.list_ports(**search_opts)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.list('ports', self.ports_path, retrieve_all,
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     for r in self._pagination(collection, path, **params):
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     res = self.get(path, params=params)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.retry_request("GET", action, body=body,
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.do_request(method, action, body=body,
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._handle_fault_response(status_code, replybody, resp)
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise exception.Unauthorized()
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.Unauthorized: Not authorized.
[ 1300.606840] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.659797] env[67820]: INFO nova.scheduler.client.report [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Deleted allocations for instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0
[ 1300.677606] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c656a1c4-6efc-413c-844b-d58c0cd3f5eb tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 632.457s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1300.680977] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 436.250s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1300.681227] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Acquiring lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1300.681446] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1300.681611] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1300.683505] env[67820]: INFO nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Terminating instance
[ 1300.685110] env[67820]: DEBUG nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1300.685303] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1300.685763] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-211a8dce-72d3-4339-949d-40b47631afc4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.697391] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492d86a3-2b15-421d-969a-fd1685a7e314 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.709280] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1300.729103] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2311d6b7-32ab-45c6-83f8-9b341e847bf0 could not be found.
[ 1300.729214] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1300.729387] env[67820]: INFO nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1300.729624] env[67820]: DEBUG oslo.service.loopingcall [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1300.729829] env[67820]: DEBUG nova.compute.manager [-] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1300.729925] env[67820]: DEBUG nova.network.neutron [-] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1300.751554] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4da227d-7d3f-4ef9-8fef-459cc5f284a2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.761096] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2561e09-de76-41f9-8822-8f9efbb7a592 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.764680] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1300.792997] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ced444-596f-4b0c-8ba8-4d3264a81ea3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.800983] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adca37a7-7b9c-452d-95d5-45649cc332be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1300.813266] env[67820]: DEBUG nova.compute.provider_tree [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1300.822274] env[67820]: DEBUG nova.scheduler.client.report [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1300.836129] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1300.836611] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1300.839860] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.074s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1300.841071] env[67820]: INFO nova.compute.claims [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1300.842878] env[67820]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67820) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1300.843126] env[67820]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
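The tracebacks in this run pass every neutronclient call through the wrapper at nova/network/neutron.py (lines 196-212 in this build), which translates a client-side 401 into a Nova exception: line 204 raises nova.exception.Unauthorized when the user's own token is rejected, while line 212 raises NeutronAdminCredentialConfigurationInvalid, since a rejected admin token points at bad [neutron] credentials in nova.conf (hence the "please verify Neutron admin credential" record above). A minimal sketch of that translation, with stand-in exception classes rather than Nova's real ones:

    import functools

    class Unauthorized(Exception):
        """Stands in for nova.exception.Unauthorized."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stands in for the nova.exception class of the same name."""

    class NeutronClientUnauthorized(Exception):
        """Stands in for neutronclient.common.exceptions.Unauthorized."""

    def translate_auth_errors(is_admin_client):
        # Sketch of the wrapper pattern: which Nova exception a 401 becomes
        # depends on whose token the failing client held.
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                try:
                    return func(*args, **kwargs)
                except NeutronClientUnauthorized:
                    if is_admin_client:
                        # Admin token rejected: nova.conf credentials are bad.
                        raise NeutronAdminCredentialConfigurationInvalid()
                    # User token rejected or expired: surface a plain 401.
                    raise Unauthorized()
            return wrapper
        return decorator

    @translate_auth_errors(is_admin_client=True)
    def list_ports(**search_opts):
        raise NeutronClientUnauthorized()  # simulate the 401 seen above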
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     exception_handler_v20(status_code, error_body)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     raise client_exc(message=error_message,
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5fd80c8f-860b-48b8-ae64-f3a890afdbe6']
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred:
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall Traceback (most recent call last):
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     result = func(*self.args, **self.kw)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     result = f(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     self._deallocate_network(
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     self.network_api.deallocate_for_instance(
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     data = neutron.list_ports(**search_opts)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     return self.list('ports', self.ports_path, retrieve_all,
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     for r in self._pagination(collection, path, **params):
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     res = self.get(path, params=params)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     return self.retry_request("GET", action, body=body,
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     return self.do_request(method, action, body=body,
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     ret = obj(*args, **kwargs)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     self._handle_fault_response(status_code, replybody, resp)
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1300.843882] env[67820]: ERROR oslo.service.loopingcall
[ 1300.845553] env[67820]: ERROR nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
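The failed call named above, 'oslo_service.loopingcall.RetryDecorator.__call__.._func', is the inner function of oslo.service's RetryDecorator, which the compute manager wraps around network deallocation. Exceptions listed for the decorator are retried with growing sleeps; anything else, like the NeutronAdminCredentialConfigurationInvalid raised here, escapes on the first attempt and the looping call is logged as failed. A small runnable sketch of that behavior (the retried exception and counter are illustrative, not Nova's code):

    from oslo_service import loopingcall

    class TransientNetworkError(Exception):
        """Stands in for a retryable failure; assumption for this sketch."""

    attempts = {"n": 0}

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=2,
                                exceptions=(TransientNetworkError,))
    def deallocate_network_with_retries():
        # Only TransientNetworkError is retried; any other exception
        # propagates immediately, as in the log record above.
        attempts["n"] += 1
        if attempts["n"] < 3:
            raise TransientNetworkError("transient failure %d" % attempts["n"])
        return "deallocated"

    print(deallocate_network_with_retries())  # retries twice, then succeeds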
[ 1300.886665] env[67820]: ERROR nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     exception_handler_v20(status_code, error_body)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise client_exc(message=error_message,
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Neutron server returns request_ids: ['req-5fd80c8f-860b-48b8-ae64-f3a890afdbe6']
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During handling of the above exception, another exception occurred:
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Traceback (most recent call last):
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._delete_instance(context, instance, bdms)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._shutdown_instance(context, instance, bdms)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._try_deallocate_network(context, instance, requested_networks)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     with excutils.save_and_reraise_exception():
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self.force_reraise()
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise self.value
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     _deallocate_network_with_retries()
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return evt.wait()
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     result = hub.switch()
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.greenlet.switch()
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     result = func(*self.args, **self.kw)
[ 1300.886665] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     result = f(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._deallocate_network(
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self.network_api.deallocate_for_instance(
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     data = neutron.list_ports(**search_opts)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.list('ports', self.ports_path, retrieve_all,
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     for r in self._pagination(collection, path, **params):
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     res = self.get(path, params=params)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.retry_request("GET", action, body=body,
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     return self.do_request(method, action, body=body,
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     ret = obj(*args, **kwargs)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     self._handle_fault_response(status_code, replybody, resp)
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1300.888074] env[67820]: ERROR nova.compute.manager [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0]
[ 1300.889964] env[67820]: DEBUG nova.compute.utils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1300.891221] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1300.891796] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1300.900874] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1300.927302] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.246s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1300.928380] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 161.324s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1300.928560] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1300.928729] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "2311d6b7-32ab-45c6-83f8-9b341e847bf0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1300.974793] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1300.981946] env[67820]: DEBUG nova.policy [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1300.987787] env[67820]: INFO nova.compute.manager [None req-ab9f2a5b-ef1a-436b-8127-f87b9eefd631 tempest-MigrationsAdminTest-82745144 tempest-MigrationsAdminTest-82745144-project-member] [instance: 2311d6b7-32ab-45c6-83f8-9b341e847bf0] Successfully reverted task state from None on failure for instance.
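The "Policy check for network:attach_external_network failed" record above is oslo.policy evaluating a rule against the request's credential dict and returning False for a member/reader token. A hedged sketch of such a check; the 'role:admin' rule string and wiring are assumptions for illustration, not Nova's actual default policy:

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(policy.RuleDefault(
        'network:attach_external_network', 'role:admin'))

    # Trimmed from the credentials dict in the DEBUG record above.
    creds = {'roles': ['member', 'reader'], 'is_admin': False,
             'project_id': '890ffca423414cd69eca6a6bf4d1ac66'}
    allowed = enforcer.authorize('network:attach_external_network',
                                 {}, creds, do_raise=False)
    print(allowed)  # False for a member/reader token, matching the log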
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5fd80c8f-860b-48b8-ae64-f3a890afdbe6']
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1300.992811] env[67820]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     _deallocate_network_with_retries()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     return evt.wait()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     result = hub.switch()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self._deallocate_network(
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     self.network_api.deallocate_for_instance(
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     data = neutron.list_ports(**search_opts)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     return self.list('ports', self.ports_path, retrieve_all,
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     for r in self._pagination(collection, path, **params):
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     res = self.get(path, params=params)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     return self.retry_request("GET", action, body=body,
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     return self.do_request(method, action, body=body,
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1300.994425] env[67820]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server     self._handle_fault_response(status_code, replybody, resp)
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1300.996300] env[67820]: ERROR oslo_messaging.rpc.server
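Read bottom-up, the chained traceback shows terminate_instance tearing down networking and calling neutron.list_ports(); the Neutron client's 401 (first traceback) is caught by the wrapper frames at nova/network/neutron.py lines 196/212 and re-raised as NeutronAdminCredentialConfigurationInvalid, i.e. Keystone is rejecting the [neutron] service credentials Nova itself uses, not the tenant's token. The snippet below is a minimal illustrative sketch of that translation pattern under those assumptions, not Nova's actual wrapper; the exception classes are stand-ins:

    import functools

    class Unauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the Nova exception at the top of the trace."""

    def translate_unauthorized(func):
        # Illustrative version of the repeated `wrapper` frames: every client
        # call is funneled through a decorator that converts a 401 raised
        # under Nova's own service credentials into a configuration error.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Unauthorized:
                # Raising inside the except block (without `from`) keeps the
                # 401 attached as __context__, which is exactly why the log
                # prints "During handling of the above exception, another
                # exception occurred" between the two tracebacks.
                raise NeutronAdminCredentialConfigurationInvalid()
        return wrapper

    @translate_unauthorized
    def list_ports(**search_opts):
        # Simulate the Neutron API rejecting the service user's token.
        raise Unauthorized("401: The request you have made requires authentication.")

Calling list_ports() here reproduces the shape of the failure above: the config error surfaces to the RPC server while the original 401 remains visible as the chained exception.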
[ 1301.004674] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1301.004898] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1301.005068] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1301.005254] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1301.005381] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1301.005525] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1301.005728] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1301.005884] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1301.006056] env[67820]: DEBUG nova.virt.hardware [None
req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1301.006220] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1301.006389] env[67820]: DEBUG nova.virt.hardware [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1301.007240] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1821bf2b-6f51-4643-8374-477cc5f9146b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.017895] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a2d6238-eaf4-480f-af0b-24ffbb21b5a5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.091238] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e60db2-8ed5-43ce-baca-bd1872c096c8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.098064] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22bdfe88-7564-4335-8bc8-9c32087440da {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.129369] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09848135-ee8e-4c77-a429-062f27ba36be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.136551] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-739d672b-f5f5-40cf-844f-b04ed49137ce {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1301.149618] env[67820]: DEBUG nova.compute.provider_tree [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1301.160611] env[67820]: DEBUG nova.scheduler.client.report [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1301.175524] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1301.175997] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1301.211447] env[67820]: DEBUG nova.compute.utils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1301.215199] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1301.215199] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1301.222024] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1301.292308] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
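The "Inventory has not changed ... based on inventory data" record above is the resource tracker reporting this host's capacity to placement. The schedulable amount per resource class follows the standard placement rule (total - reserved) * allocation_ratio, so with allocation_ratio 4.0 the 48 physical VCPUs can be over-committed to 192. The snippet below just reproduces that arithmetic on the logged inventory; it is illustrative only, not code from Nova or the placement service:

    # Capacity math applied to the inventory record logged above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def effective_capacity(inv):
        # Schedulable pool per resource class: (total - reserved) * allocation_ratio.
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(effective_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

Note that min_unit/max_unit/step_size in the logged inventory constrain a single allocation rather than the pool: max_unit 16 caps any one instance at 16 vCPUs even though 192 are schedulable in total. Each m1.nano claim (1 vCPU, 128 MB, 1 GB root disk per the flavor dumped earlier) is deducted from these pools under the "compute_resources" lock whose release is logged above.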
[ 1301.298243] env[67820]: DEBUG nova.policy [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aac589c6933248f4931af9ebf3dbbde9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c6acf4fc89fa4b6391c4029070ea2773', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1301.316783] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Successfully created port: 7bee6944-b661-43d9-987d-3c958f806ef3 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1301.330274] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1301.330517] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1301.330728] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1301.330864] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1301.331017] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1301.331183] env[67820]: DEBUG nova.virt.hardware [None
req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1301.331391] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1301.331547] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1301.331711] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1301.331874] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1301.332170] env[67820]: DEBUG nova.virt.hardware [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1301.333344] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad44c6b6-ad5b-4328-a1b3-6fa8a79242be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.341857] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff380eee-6e9f-4c83-97b8-7004a645e1d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1301.621655] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1301.797457] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Successfully created port: 2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1302.005576] env[67820]: DEBUG nova.compute.manager [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Received event network-vif-plugged-7bee6944-b661-43d9-987d-3c958f806ef3 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1302.005576] env[67820]: DEBUG oslo_concurrency.lockutils [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] Acquiring lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.005576] env[67820]: DEBUG oslo_concurrency.lockutils [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1302.005576] env[67820]: DEBUG oslo_concurrency.lockutils [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1302.005576] env[67820]: DEBUG nova.compute.manager [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] No waiting events found dispatching network-vif-plugged-7bee6944-b661-43d9-987d-3c958f806ef3 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1302.005576] env[67820]: WARNING nova.compute.manager [req-567d19cd-0088-4b80-8450-42db6a93fe96 req-f782f887-11f6-4952-9a33-8c89c9c21d9d service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Received unexpected event network-vif-plugged-7bee6944-b661-43d9-987d-3c958f806ef3 for instance with vm_state building and task_state spawning. 
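The five records above show the external-event path: Neutron notifies nova-compute that port 7bee6944 is plugged, the manager takes the per-instance "-events" lock, tries to pop a registered waiter for network-vif-plugged-<port-id>, finds none because the spawn has not started waiting yet, and logs the WARNING. Below is a toy sketch of that prepare/pop latch pattern, assuming a simplified single-event shape; it is not Nova's InstanceEvents implementation:

    import threading

    class InstanceEventLatch:
        """Toy model of the pop_instance_event pattern visible in the log."""

        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            # Called by the spawn path *before* the action that triggers
            # the event; the caller later blocks on the returned latch.
            latch = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = latch
            return latch

        def dispatch(self, instance_uuid, event_name):
            # Called when the external notification arrives over RPC.
            with self._lock:
                latch = self._waiters.pop((instance_uuid, event_name), None)
            if latch is None:
                # Nobody registered yet: the "No waiting events found
                # dispatching ... Received unexpected event" case above.
                return False
            latch.set()
            return True

When the notification races ahead of the waiter, as it does here while the instance is still in vm_state building / task_state spawning, the event is simply dropped, which is why this WARNING is typically benign during spawn.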
[ 1302.082181] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Successfully created port: b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1302.291228] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Successfully updated port: 7bee6944-b661-43d9-987d-3c958f806ef3 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1302.307018] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1302.307018] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1302.307018] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1302.348852] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1302.516308] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Updating instance_info_cache with network_info: [{"id": "7bee6944-b661-43d9-987d-3c958f806ef3", "address": "fa:16:3e:5f:32:22", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bee6944-b6", "ovs_interfaceid": "7bee6944-b661-43d9-987d-3c958f806ef3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1302.527122] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1302.527416] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance network_info: |[{"id": "7bee6944-b661-43d9-987d-3c958f806ef3", "address": "fa:16:3e:5f:32:22", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bee6944-b6", "ovs_interfaceid": "7bee6944-b661-43d9-987d-3c958f806ef3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1302.527819] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:32:22', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7bee6944-b661-43d9-987d-3c958f806ef3', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1302.535507] env[67820]: DEBUG oslo.service.loopingcall [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1302.535970] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1302.536213] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6108712c-1c10-49c5-b957-9e8ca5e543f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1302.556517] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1302.556517] env[67820]: value = "task-3467391" [ 1302.556517] env[67820]: _type = "Task" [ 1302.556517] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1302.564284] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467391, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1302.956498] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1302.976587] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Successfully updated port: 2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1303.021233] env[67820]: DEBUG nova.compute.manager [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received event network-vif-plugged-2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1303.021233] env[67820]: DEBUG oslo_concurrency.lockutils [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] Acquiring lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1303.021233] env[67820]: DEBUG oslo_concurrency.lockutils [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1303.021233] env[67820]: DEBUG oslo_concurrency.lockutils [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1303.021233] env[67820]: DEBUG nova.compute.manager [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] No waiting events found dispatching network-vif-plugged-2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1303.021233] env[67820]: WARNING nova.compute.manager [req-6f3e5bd8-6109-43f7-aa94-7505e38a40d7 req-be13e370-3fc2-4523-8b58-2860085689f1 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received unexpected event network-vif-plugged-2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a for instance with vm_state building and task_state spawning. [ 1303.066716] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467391, 'name': CreateVM_Task, 'duration_secs': 0.296909} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1303.067156] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1303.067952] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.068296] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1303.068748] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1303.069137] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-73a4997d-2e16-4506-a069-51ba7efedb41 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1303.076703] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1303.076703] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5258b2ba-01fb-f048-e7f4-12449791f749" [ 1303.076703] env[67820]: _type = "Task" [ 1303.076703] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1303.082534] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5258b2ba-01fb-f048-e7f4-12449791f749, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1303.584614] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1303.584882] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1303.585459] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.687129] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Successfully updated port: b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1303.698418] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1303.698564] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1303.698714] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1303.736048] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1303.745054] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1303.745288] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1304.086018] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Updating instance_info_cache with network_info: [{"id": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "address": "fa:16:3e:31:60:2c", "network": {"id": "a4a4c3cb-c632-494c-a1ce-b0fd432fe401", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1551289276", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b6ac3f5-f9", "ovs_interfaceid": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "address": "fa:16:3e:e4:4b:2b", "network": {"id": "a09ebb8f-aa46-4b86-a161-331430142542", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-808430590", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.47", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4dec033-79", "ovs_interfaceid": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1304.097639] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1304.098081] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance network_info: |[{"id": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "address": "fa:16:3e:31:60:2c", "network": {"id": "a4a4c3cb-c632-494c-a1ce-b0fd432fe401", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1551289276", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b6ac3f5-f9", "ovs_interfaceid": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "address": "fa:16:3e:e4:4b:2b", "network": {"id": "a09ebb8f-aa46-4b86-a161-331430142542", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-808430590", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.47", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4dec033-79", "ovs_interfaceid": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1304.098497] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance VIF info 
[{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:60:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a0734cc4-5718-45e2-9f98-0ded96880bef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:4b:2b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b4dec033-7966-4ee4-b4d4-d9bb89a5942f', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1304.107855] env[67820]: DEBUG oslo.service.loopingcall [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1304.108373] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1304.108603] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3acd29c0-df1b-4391-a9a6-ffb0248af5c7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.130986] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1304.130986] env[67820]: value = "task-3467392" [ 1304.130986] env[67820]: _type = "Task" [ 1304.130986] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1304.138857] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467392, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1304.294500] env[67820]: DEBUG nova.compute.manager [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Received event network-changed-7bee6944-b661-43d9-987d-3c958f806ef3 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1304.294731] env[67820]: DEBUG nova.compute.manager [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Refreshing instance network info cache due to event network-changed-7bee6944-b661-43d9-987d-3c958f806ef3. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1304.294961] env[67820]: DEBUG oslo_concurrency.lockutils [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] Acquiring lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1304.295150] env[67820]: DEBUG oslo_concurrency.lockutils [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] Acquired lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1304.295567] env[67820]: DEBUG nova.network.neutron [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Refreshing network info cache for port 7bee6944-b661-43d9-987d-3c958f806ef3 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1304.569433] env[67820]: DEBUG nova.network.neutron [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Updated VIF entry in instance network info cache for port 7bee6944-b661-43d9-987d-3c958f806ef3. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1304.569802] env[67820]: DEBUG nova.network.neutron [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Updating instance_info_cache with network_info: [{"id": "7bee6944-b661-43d9-987d-3c958f806ef3", "address": "fa:16:3e:5f:32:22", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7bee6944-b6", "ovs_interfaceid": "7bee6944-b661-43d9-987d-3c958f806ef3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1304.579344] env[67820]: DEBUG oslo_concurrency.lockutils [req-91be3284-44da-4bce-8d75-acab6ce088f2 req-c594bc30-4cad-4972-9ad0-64e7970ab682 service nova] Releasing lock "refresh_cache-3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1304.640727] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467392, 'name': CreateVM_Task, 'duration_secs': 0.361758} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1304.641631] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1304.642141] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1304.642319] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1304.642629] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1304.642917] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc0e3c2d-b7d3-408f-b2f3-51eddd6dacad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1304.648217] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1304.648217] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d2fbd2-04ec-c547-5e9c-1b387533b939" [ 1304.648217] env[67820]: _type = "Task" [ 1304.648217] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1304.657700] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d2fbd2-04ec-c547-5e9c-1b387533b939, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1305.083322] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received event network-changed-2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1305.083534] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Refreshing instance network info cache due to event network-changed-2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1305.083785] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Acquiring lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1305.083923] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Acquired lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1305.084099] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Refreshing network info cache for port 2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1305.159523] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1305.159764] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1305.159970] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1305.386575] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Updated VIF entry in instance network info cache for port 2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1305.386989] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Updating instance_info_cache with network_info: [{"id": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "address": "fa:16:3e:31:60:2c", "network": {"id": "a4a4c3cb-c632-494c-a1ce-b0fd432fe401", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1551289276", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b6ac3f5-f9", "ovs_interfaceid": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "address": "fa:16:3e:e4:4b:2b", "network": {"id": "a09ebb8f-aa46-4b86-a161-331430142542", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-808430590", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.47", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4dec033-79", "ovs_interfaceid": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1305.400035] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Releasing lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1305.400292] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received event network-vif-plugged-b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1305.400490] env[67820]: DEBUG oslo_concurrency.lockutils 
[req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Acquiring lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1305.400829] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1305.401087] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1305.401270] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] No waiting events found dispatching network-vif-plugged-b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1305.401441] env[67820]: WARNING nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received unexpected event network-vif-plugged-b4dec033-7966-4ee4-b4d4-d9bb89a5942f for instance with vm_state building and task_state spawning. [ 1305.401603] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Received event network-changed-b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1305.401760] env[67820]: DEBUG nova.compute.manager [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Refreshing instance network info cache due to event network-changed-b4dec033-7966-4ee4-b4d4-d9bb89a5942f. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1305.401949] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Acquiring lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1305.402100] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Acquired lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1305.402256] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Refreshing network info cache for port b4dec033-7966-4ee4-b4d4-d9bb89a5942f {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1305.621613] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.621832] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.621978] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1305.643744] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Updated VIF entry in instance network info cache for port b4dec033-7966-4ee4-b4d4-d9bb89a5942f. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1305.644367] env[67820]: DEBUG nova.network.neutron [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Updating instance_info_cache with network_info: [{"id": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "address": "fa:16:3e:31:60:2c", "network": {"id": "a4a4c3cb-c632-494c-a1ce-b0fd432fe401", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1551289276", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b6ac3f5-f9", "ovs_interfaceid": "2b6ac3f5-f976-4b1f-bdce-307fa2be5e2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "address": "fa:16:3e:e4:4b:2b", "network": {"id": "a09ebb8f-aa46-4b86-a161-331430142542", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-808430590", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.47", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "c6acf4fc89fa4b6391c4029070ea2773", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb4dec033-79", "ovs_interfaceid": "b4dec033-7966-4ee4-b4d4-d9bb89a5942f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1305.656856] env[67820]: DEBUG oslo_concurrency.lockutils [req-86017abc-0490-4df2-9122-9d4c9cc3dca4 req-ae5d31f2-9596-40b4-af4f-5149769a3242 service nova] Releasing lock "refresh_cache-d3dc6127-8512-4c5c-b04e-b6a639a1d1de" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1307.622436] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.622697] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.616105] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.620645] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.620802] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1308.620923] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1308.641500] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.641821] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.641821] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.641941] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642110] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642240] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642359] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642474] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642589] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642766] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1308.642905] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1310.621108] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1310.632643] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1310.632896] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1310.633077] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1310.633240] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1310.634357] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bc1bdd6-90b5-47c5-b8c8-bb40698bae96 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.643102] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aab0cd1-f277-4e65-9043-80a3f0c7f875 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.656818] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04cc1d37-b00d-43f4-853b-b713a2296f0d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.663098] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48fd4b79-b92e-4c4b-9584-ae3376d1ae4c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.693548] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180888MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1310.693686] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1310.693762] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1310.764157] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764333] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764463] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764584] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764699] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764813] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.764977] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.765081] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.765196] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.765312] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1310.776440] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.786803] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f0934337-b7e8-48ec-b30c-24c92c79267b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.796866] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4432bc55-aadb-4c3b-8b15-28edfbb40d66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.806890] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.815849] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.825081] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1310.825299] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1310.825442] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1311.010346] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd006eb3-e22a-4110-9bcf-85d173a88b2e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.017866] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d12f4d71-cb47-4d03-b752-b7672a3d72c8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.047141] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e13db6a9-737a-450e-b544-5f534d15211e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.054041] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-594356c8-cfdd-4c17-9405-924a1d67fb6d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.067560] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 
0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1311.075622] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1311.089902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1311.090099] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.396s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1312.090859] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1348.749610] env[67820]: WARNING oslo_vmware.rw_handles [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1348.749610] env[67820]: ERROR oslo_vmware.rw_handles [ 1348.750133] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Downloaded image file data 
4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1348.752379] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1348.752635] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Copying Virtual Disk [datastore1] vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/17d98d46-bf55-4a82-a897-a16d1fc78921/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1348.752937] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-438072d9-b086-42f8-a570-7557effdebe2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1348.762113] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1348.762113] env[67820]: value = "task-3467393" [ 1348.762113] env[67820]: _type = "Task" [ 1348.762113] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1348.770032] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467393, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.272273] env[67820]: DEBUG oslo_vmware.exceptions [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1349.272568] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1349.273184] env[67820]: ERROR nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.273184] env[67820]: Faults: ['InvalidArgument'] [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Traceback (most recent call last): [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] yield resources [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self.driver.spawn(context, instance, image_meta, [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self._fetch_image_if_missing(context, vi) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] image_cache(vi, tmp_image_ds_loc) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] vm_util.copy_virtual_disk( [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] session._wait_for_task(vmdk_copy_task) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return self.wait_for_task(task_ref) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return evt.wait() [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] result = hub.switch() [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return self.greenlet.switch() [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self.f(*self.args, **self.kw) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] raise exceptions.translate_fault(task_info.error) [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Faults: ['InvalidArgument'] [ 1349.273184] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] [ 1349.273969] env[67820]: INFO nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Terminating instance [ 1349.275149] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1349.275308] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1349.275574] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f3b93209-787f-47a9-b191-4a8d8aa3d4c4 {{(pid=67820) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.278046] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1349.278238] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1349.278975] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06aa95b9-023f-4b89-b8fa-850ca09ca386 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.285840] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1349.286070] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-175deee5-2fb2-4915-a839-b0c421ca55c1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.288283] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1349.288401] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1349.289347] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef6d9cfc-ef87-4ea3-a136-e05ebf7f995f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.294031] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1349.294031] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52682451-cbef-8aee-1f16-95f279a3ce74" [ 1349.294031] env[67820]: _type = "Task" [ 1349.294031] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1349.301182] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52682451-cbef-8aee-1f16-95f279a3ce74, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.358825] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1349.359104] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1349.359223] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleting the datastore file [datastore1] d06b6984-d1d4-4afd-8ffd-f37407697d4b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1349.359482] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d23f5e0f-260d-4655-a6f3-9e56265d4891 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.365313] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1349.365313] env[67820]: value = "task-3467395" [ 1349.365313] env[67820]: _type = "Task" [ 1349.365313] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1349.372734] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467395, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1349.804661] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1349.805097] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating directory with path [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1349.805155] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a4d36be-3584-4ab0-a15b-a6ab91552487 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.816151] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created directory with path [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1349.816329] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Fetch image to [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1349.816493] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1349.817206] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90791a8b-3217-41c1-8823-37d57bb26422 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.823325] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a96ef6b9-2dd2-48ac-80c3-88d0caab1c80 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.831951] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea70ec4d-b03b-4a13-9f34-b09f770803f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.862690] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ccd949-7ee6-4b55-8bcb-6b77abfdac62 {{(pid=67820) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.871351] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-253160b2-565e-4b36-b80c-3ae054b2c29b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1349.877501] env[67820]: DEBUG oslo_vmware.api [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467395, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.060485} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1349.877729] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1349.877905] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1349.878087] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1349.878262] env[67820]: INFO nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Took 0.60 seconds to destroy the instance on the hypervisor. 
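[editor's note] The records around this point all follow oslo.vmware's task-wait pattern: a `*_Task` method is invoked (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task above), then `wait_for_task` polls it via `_poll_task` until it completes or, as in the `InvalidArgument: fileType` failure earlier in this log, `_poll_task` raises the translated fault (api.py line 448 in the traceback). Below is a minimal, self-contained sketch of that pattern; the vCenter host, credentials, and datastore paths are placeholders, not values from this log.

```python
# Minimal sketch of the oslo.vmware task-wait pattern traced in the
# surrounding records. Host, credentials, and disk paths are hypothetical.
from oslo_vmware import api, exceptions

# Matches the session setup logged at service start (VMwareAPISession._create_session).
session = api.VMwareAPISession(
    'vcenter.example.org', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Kick off a CopyVirtualDisk_Task, as Nova's _cache_sparse_image does when
# promoting the downloaded tmp-sparse.vmdk into the image cache.
disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
    destName='[datastore1] devstack-image-cache_base/example.vmdk')

try:
    # wait_for_task() polls the task in a looping call; if the task ends in
    # an error state, the poller raises the translated fault -- here that
    # surfaced as VimFaultException with Faults=['InvalidArgument'].
    session.wait_for_task(task)
except exceptions.VimFaultException as fault:
    print(fault.fault_list, str(fault))
```

The periodic `progress is 0%.` lines in this log are emitted by that same poll loop each time it samples the task before completion.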
[ 1349.880444] env[67820]: DEBUG nova.compute.claims [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1349.880633] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1349.880901] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1349.890790] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1349.944080] env[67820]: DEBUG oslo_vmware.rw_handles [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1350.004058] env[67820]: DEBUG oslo_vmware.rw_handles [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1350.004261] env[67820]: DEBUG oslo_vmware.rw_handles [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1350.156498] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee37ec27-a5cf-431d-98ab-b84e14a3d84e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.163746] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a82eee-2308-4637-b3df-2e5fe8b98902 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.192562] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e9db70e-4c93-4307-a7f0-62bb81abe573 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.199017] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff964149-72cf-4ef2-b649-780b5f07bedd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.214521] env[67820]: DEBUG nova.compute.provider_tree [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1350.223544] env[67820]: DEBUG nova.scheduler.client.report [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1350.239364] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.358s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1350.239852] env[67820]: ERROR nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1350.239852] env[67820]: Faults: ['InvalidArgument']
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Traceback (most recent call last):
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self.driver.spawn(context, instance, image_meta,
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self._fetch_image_if_missing(context, vi)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] image_cache(vi, tmp_image_ds_loc)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] vm_util.copy_virtual_disk(
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] session._wait_for_task(vmdk_copy_task)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return self.wait_for_task(task_ref)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return evt.wait()
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] result = hub.switch()
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] return self.greenlet.switch()
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] self.f(*self.args, **self.kw)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] raise exceptions.translate_fault(task_info.error)
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Faults: ['InvalidArgument']
[ 1350.239852] env[67820]: ERROR nova.compute.manager [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b]
[ 1350.240825] env[67820]: DEBUG nova.compute.utils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1350.242463] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Build of instance d06b6984-d1d4-4afd-8ffd-f37407697d4b was re-scheduled: A specified parameter was not correct: fileType
[ 1350.242463] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1350.242835] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1350.243032] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1350.243224] env[67820]: DEBUG nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1350.243388] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1350.602589] env[67820]: DEBUG nova.network.neutron [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1350.615385] env[67820]: INFO nova.compute.manager [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Took 0.37 seconds to deallocate network for instance.
[ 1350.729173] env[67820]: INFO nova.scheduler.client.report [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted allocations for instance d06b6984-d1d4-4afd-8ffd-f37407697d4b
[ 1350.752024] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d00801a8-5655-4025-8ef2-975d3b473a8f tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.511s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1350.753327] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.533s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1350.753548] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1350.753751] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1350.753916] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1350.760342] env[67820]: INFO nova.compute.manager [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Terminating instance
[ 1350.762570] env[67820]: DEBUG nova.compute.manager [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1350.762704] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1350.762973] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f3b729aa-9470-4ae2-9f05-e645dfefd6c0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.766236] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1350.775362] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3662662-4081-4fbf-bbb1-a064cedd2a44 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1350.805916] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d06b6984-d1d4-4afd-8ffd-f37407697d4b could not be found.
[ 1350.806278] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1350.806465] env[67820]: INFO nova.compute.manager [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1350.806708] env[67820]: DEBUG oslo.service.loopingcall [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1350.811361] env[67820]: DEBUG nova.compute.manager [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1350.811492] env[67820]: DEBUG nova.network.neutron [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1350.827729] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1350.828029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1350.829583] env[67820]: INFO nova.compute.claims [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1351.000837] env[67820]: DEBUG nova.network.neutron [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1351.009090] env[67820]: INFO nova.compute.manager [-] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] Took 0.20 seconds to deallocate network for instance.
[ 1351.093738] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a98a82b-579f-4814-bf70-a0524708c8fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.100533] env[67820]: DEBUG oslo_concurrency.lockutils [None req-efe361e7-5849-422a-80a1-101fd5c09fdb tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.347s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.102020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 211.498s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1351.102150] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d06b6984-d1d4-4afd-8ffd-f37407697d4b] During sync_power_state the instance has a pending task (deleting). Skip. [ 1351.102325] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "d06b6984-d1d4-4afd-8ffd-f37407697d4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.106239] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28383a59-0432-4cf7-b314-068a8d9acffa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.805173] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a07796e-9ecf-465c-85bb-d2a8696a3558 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.813759] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65e4cc8e-5c2f-4c7d-bedb-d469ab8d70d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1351.826788] env[67820]: DEBUG nova.compute.provider_tree [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1351.835256] env[67820]: DEBUG nova.scheduler.client.report [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 
'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1351.851628] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.024s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1351.852222] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1351.887348] env[67820]: DEBUG nova.compute.utils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1351.888626] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1351.888793] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1351.899746] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1351.948910] env[67820]: DEBUG nova.policy [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef2ed74bd62c43369adae69975d0653c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '95f8209c21b64b61af651255904cac9c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1351.966244] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1351.992658] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1351.992909] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1351.993126] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1351.993321] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1351.993464] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1351.993619] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1351.993835] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1351.994025] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1351.994224] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 
tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1351.994390] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1351.994559] env[67820]: DEBUG nova.virt.hardware [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1351.995409] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91d5542c-18d9-465b-8cac-a2aba57bc6c8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1352.003545] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b54181eb-5261-4879-bc56-4de000e122d6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1352.279206] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Successfully created port: c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1352.856013] env[67820]: DEBUG nova.compute.manager [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Received event network-vif-plugged-c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1352.856252] env[67820]: DEBUG oslo_concurrency.lockutils [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b service nova] Acquiring lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1352.856457] env[67820]: DEBUG oslo_concurrency.lockutils [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b service nova] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1352.856621] env[67820]: DEBUG oslo_concurrency.lockutils [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b service nova] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1352.856784] env[67820]: DEBUG nova.compute.manager [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b 
service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] No waiting events found dispatching network-vif-plugged-c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1352.856944] env[67820]: WARNING nova.compute.manager [req-f1c479e1-bc3f-4d5f-8c9f-5f760b8f9c20 req-e8da9ae8-410e-4eab-9b6f-65b78983c47b service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Received unexpected event network-vif-plugged-c3093ff0-e3c3-4702-af4b-9a83c443d7ed for instance with vm_state building and task_state spawning. [ 1353.009250] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Successfully updated port: c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1353.022504] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1353.022929] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquired lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1353.022929] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1353.071397] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1353.272526] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Updating instance_info_cache with network_info: [{"id": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "address": "fa:16:3e:03:5e:a5", "network": {"id": "7f15c2f8-ef70-40e3-809e-79c840149a24", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-935218426-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95f8209c21b64b61af651255904cac9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3093ff0-e3", "ovs_interfaceid": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1353.285647] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Releasing lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1353.285925] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance network_info: |[{"id": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "address": "fa:16:3e:03:5e:a5", "network": {"id": "7f15c2f8-ef70-40e3-809e-79c840149a24", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-935218426-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95f8209c21b64b61af651255904cac9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3093ff0-e3", "ovs_interfaceid": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1353.286324] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:03:5e:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f1e0e39-0c84-4fcd-9113-cc528c3eb185', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c3093ff0-e3c3-4702-af4b-9a83c443d7ed', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1353.293967] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Creating folder: Project (95f8209c21b64b61af651255904cac9c). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1353.294503] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5501ab51-298c-42f7-b808-448954aa4665 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.304520] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Created folder: Project (95f8209c21b64b61af651255904cac9c) in parent group-v692668. [ 1353.304694] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Creating folder: Instances. Parent ref: group-v692739. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1353.304902] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0186a74c-00e7-470e-a7b6-9ec22412e9c6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.313829] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Created folder: Instances in parent group-v692739. [ 1353.314074] env[67820]: DEBUG oslo.service.loopingcall [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1353.314384] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1353.314445] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f541a522-abf4-4d5f-ab3e-ec129e8be96c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.333087] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1353.333087] env[67820]: value = "task-3467398" [ 1353.333087] env[67820]: _type = "Task" [ 1353.333087] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1353.340124] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467398, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1353.843507] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467398, 'name': CreateVM_Task, 'duration_secs': 0.302759} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1353.843842] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1353.847092] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1353.847092] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1353.847092] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1353.847092] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c67de066-7785-4440-b8c1-5e081b1d35e3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1353.853167] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for the task: (returnval){ [ 1353.853167] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]529b4205-1714-b6e7-cb9c-7ceb2ae4d4b4" [ 1353.853167] env[67820]: _type = "Task" [ 1353.853167] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1353.860104] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]529b4205-1714-b6e7-cb9c-7ceb2ae4d4b4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1354.361714] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1354.361989] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1354.362220] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1354.900462] env[67820]: DEBUG nova.compute.manager [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Received event network-changed-c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1354.900701] env[67820]: DEBUG nova.compute.manager [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Refreshing instance network info cache due to event network-changed-c3093ff0-e3c3-4702-af4b-9a83c443d7ed. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1354.900974] env[67820]: DEBUG oslo_concurrency.lockutils [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] Acquiring lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1354.901153] env[67820]: DEBUG oslo_concurrency.lockutils [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] Acquired lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1354.901201] env[67820]: DEBUG nova.network.neutron [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Refreshing network info cache for port c3093ff0-e3c3-4702-af4b-9a83c443d7ed {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1355.213269] env[67820]: DEBUG nova.network.neutron [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Updated VIF entry in instance network info cache for port c3093ff0-e3c3-4702-af4b-9a83c443d7ed. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1355.213685] env[67820]: DEBUG nova.network.neutron [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Updating instance_info_cache with network_info: [{"id": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "address": "fa:16:3e:03:5e:a5", "network": {"id": "7f15c2f8-ef70-40e3-809e-79c840149a24", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-935218426-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "95f8209c21b64b61af651255904cac9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3093ff0-e3", "ovs_interfaceid": "c3093ff0-e3c3-4702-af4b-9a83c443d7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1355.224032] env[67820]: DEBUG oslo_concurrency.lockutils [req-461d1461-b02f-4315-bd92-c8f1a1dde78c req-14bd9bfa-c5de-44de-961a-cb964c6abdd9 service nova] Releasing lock "refresh_cache-8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1358.659073] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.660087] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1363.621801] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1364.616068] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.622191] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.622516] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.622581] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1368.621838] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.616405] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.621063] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1369.621228] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1369.621429] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1369.643210] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643387] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643510] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643634] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643754] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643872] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.643989] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.644139] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.644266] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.644382] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1369.644496] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1369.644966] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1371.156249] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1371.621520] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1372.621416] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1372.633589] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1372.633706] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1372.633856] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1372.634042] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1372.635171] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9240dae8-8a07-4ed2-a3d2-b6623afd340d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.643758] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40daf14b-fb36-4074-a54c-000b05c29076 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.657116] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3bde412-c17d-4b91-bbe7-a3cb2cff1263 {{(pid=67820) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.663123] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9154a955-3a7e-4565-973e-1424e79bc667 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1372.692614] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1372.692773] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1372.692964] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1372.763908] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 31ec9cab-abfb-4a73-8df8-057670201267 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764145] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764278] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764404] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764531] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764641] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764759] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764875] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.764987] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.765116] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1372.776209] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f0934337-b7e8-48ec-b30c-24c92c79267b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.786302] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4432bc55-aadb-4c3b-8b15-28edfbb40d66 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.796053] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.805669] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.814494] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.824533] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1372.824750] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1372.824894] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1373.007472] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8353eeae-20a1-4ce5-a9a2-dfc49f63f535 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.014859] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2afe20d6-0eeb-4462-8ada-5324758908e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.043479] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ae141c-5c80-4e5d-a21c-2cf78b7f13d7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.050054] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cb4e8fd-2c23-467d-aa77-2ab9543f21db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1373.063380] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 
0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1373.072044] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1373.084683] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1373.084863] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1375.858579] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1392.008279] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c95813c5-5516-4f78-9c6b-a7b04cef4292 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "e304b657-1f29-46e5-9f52-8809f8b29606" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1392.008616] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c95813c5-5516-4f78-9c6b-a7b04cef4292 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "e304b657-1f29-46e5-9f52-8809f8b29606" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.134702] env[67820]: WARNING oslo_vmware.rw_handles [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles File
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1398.134702] env[67820]: ERROR oslo_vmware.rw_handles [ 1398.135264] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1398.137630] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1398.137883] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Copying Virtual Disk [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/0f6f7a5e-fa9f-4bad-a51f-8f525caf5a43/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1398.138177] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-798669b9-c66b-48a8-bf58-88b6ea8897b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.145996] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1398.145996] env[67820]: value = "task-3467399" [ 1398.145996] env[67820]: _type = "Task" [ 1398.145996] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1398.153704] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': task-3467399, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1398.656697] env[67820]: DEBUG oslo_vmware.exceptions [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1398.656985] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1398.657591] env[67820]: ERROR nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1398.657591] env[67820]: Faults: ['InvalidArgument'] [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Traceback (most recent call last): [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] yield resources [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self.driver.spawn(context, instance, image_meta, [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self._fetch_image_if_missing(context, vi) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] image_cache(vi, tmp_image_ds_loc) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] vm_util.copy_virtual_disk( [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", 
line 1423, in copy_virtual_disk [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] session._wait_for_task(vmdk_copy_task) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return self.wait_for_task(task_ref) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return evt.wait() [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] result = hub.switch() [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return self.greenlet.switch() [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self.f(*self.args, **self.kw) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] raise exceptions.translate_fault(task_info.error) [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Faults: ['InvalidArgument'] [ 1398.657591] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] [ 1398.658409] env[67820]: INFO nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Terminating instance [ 1398.659653] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1398.659837] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] 
Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1398.660490] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1398.660686] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1398.660909] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac860b7c-9759-49d2-b521-2539b505bf08 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.663576] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d5baab1-3858-4bd5-b514-34c817c9ab95 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.671047] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1398.671663] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e21a771c-a433-4138-8f5a-4dfb6751873b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.674369] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1398.674369] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1398.675270] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-014eb9de-80c8-4162-81c0-dff9240a8d60 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.680345] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for the task: (returnval){ [ 1398.680345] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52669583-348c-33b5-c92b-097b562a2d41" [ 1398.680345] env[67820]: _type = "Task" [ 1398.680345] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1398.688310] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52669583-348c-33b5-c92b-097b562a2d41, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1398.796261] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1398.796261] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1398.796470] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleting the datastore file [datastore1] 31ec9cab-abfb-4a73-8df8-057670201267 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1398.796720] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7b9fa16a-39e9-46fc-85e4-107e24707f52 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1398.802897] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1398.802897] env[67820]: value = "task-3467401" [ 1398.802897] env[67820]: _type = "Task" [ 1398.802897] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1398.812854] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': task-3467401, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1399.193891] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1399.194548] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Creating directory with path [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1399.194548] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e354c014-6768-4521-810d-83d29aa9c673 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.206920] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Created directory with path [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1399.206920] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Fetch image to [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1399.207076] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1399.207821] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17f29254-71a9-4a10-a20e-3667ed26bfb5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.216665] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340f0100-33ca-4f5a-a78c-377373d1edc8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.228120] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec78ff75-2564-4384-9b6b-cc84c5fd4acb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.266609] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fedffa88-e07e-4f79-beea-4a43cd798083 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.272627] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e4275a24-33ff-4721-ba34-b30176d86481 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.292723] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1399.312741] env[67820]: DEBUG oslo_vmware.api [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': task-3467401, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.279911} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1399.312980] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1399.314582] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1399.314582] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1399.314582] env[67820]: INFO nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Took 0.65 seconds to destroy the instance on the hypervisor. 
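The task lifecycle traced above (Invoking ...Task, "Waiting for the task ... to complete", "progress is 0%", then "completed successfully" with a duration_secs) is produced by a simple poll loop in oslo.vmware's wait_for_task/_poll_task. A minimal sketch of that pattern follows, assuming a hypothetical get_task_info() callable returning a TaskInfo-like dict; this is only the shape of the mechanism, not the actual oslo.vmware code:

    import time

    class TaskFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, interval=0.5, timeout=300.0):
        # Poll until the remote task reaches a terminal state.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # The real client translates the fault into an exception here;
                # "Fault InvalidArgument not matched" in the log means no more
                # specific exception class was registered for that fault name.
                raise TaskFault(info.get('error', 'unknown fault'))
            print("progress is %d%%" % info.get('progress', 0))
            time.sleep(interval)
        raise TimeoutError('task did not complete in %.0fs' % timeout)

    # Example: a fake task that succeeds on the third poll.
    polls = iter([{'state': 'running', 'progress': 0},
                  {'state': 'running', 'progress': 50},
                  {'state': 'success', 'duration_secs': 0.28}])
    print(wait_for_task(lambda: next(polls), interval=0.01))
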
[ 1399.321421] env[67820]: DEBUG nova.compute.claims [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1399.321421] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1399.321421] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1399.368471] env[67820]: DEBUG oslo_vmware.rw_handles [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1399.440681] env[67820]: DEBUG oslo_vmware.rw_handles [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1399.442025] env[67820]: DEBUG oslo_vmware.rw_handles [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1399.655754] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cb0e5d2-b2e5-4c68-86c3-385ca74bc29f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.664436] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e73c358-3f16-44f3-8419-f73510fc4b25 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.699768] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25993a6-491c-460e-b3d4-3bc5cf136887 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.706847] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b279f15-1bbd-4daa-96a2-780a9079fa59 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1399.723223] env[67820]: DEBUG nova.compute.provider_tree [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1399.738018] env[67820]: DEBUG nova.scheduler.client.report [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1399.753399] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.432s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1399.756027] env[67820]: ERROR nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1399.756027] env[67820]: Faults: ['InvalidArgument'] [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Traceback (most recent call last): [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1399.756027] env[67820]: ERROR nova.compute.manager 
[instance: 31ec9cab-abfb-4a73-8df8-057670201267] self.driver.spawn(context, instance, image_meta, [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self._fetch_image_if_missing(context, vi) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] image_cache(vi, tmp_image_ds_loc) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] vm_util.copy_virtual_disk( [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] session._wait_for_task(vmdk_copy_task) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return self.wait_for_task(task_ref) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return evt.wait() [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] result = hub.switch() [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] return self.greenlet.switch() [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] self.f(*self.args, **self.kw) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] raise exceptions.translate_fault(task_info.error) [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Faults: ['InvalidArgument'] [ 1399.756027] env[67820]: ERROR nova.compute.manager [instance: 31ec9cab-abfb-4a73-8df8-057670201267] [ 1399.756027] env[67820]: DEBUG nova.compute.utils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1399.761018] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Build of instance 31ec9cab-abfb-4a73-8df8-057670201267 was re-scheduled: A specified parameter was not correct: fileType [ 1399.761018] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1399.761018] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1399.761018] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1399.761018] env[67820]: DEBUG nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1399.761018] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1401.541552] env[67820]: DEBUG nova.network.neutron [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1401.556302] env[67820]: INFO nova.compute.manager [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Took 1.80 seconds to deallocate network for instance. [ 1401.660413] env[67820]: INFO nova.scheduler.client.report [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleted allocations for instance 31ec9cab-abfb-4a73-8df8-057670201267 [ 1401.685639] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9fddbbfa-608e-42b3-a1cc-80d3c392057e tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.465s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.686949] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.751s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.687441] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "31ec9cab-abfb-4a73-8df8-057670201267-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.688282] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.688282] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.690907] env[67820]: INFO nova.compute.manager [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Terminating instance [ 1401.693730] env[67820]: DEBUG nova.compute.manager [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1401.693951] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1401.694129] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c51ada29-a925-4898-afac-91495c98b800 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.699386] env[67820]: DEBUG nova.compute.manager [None req-e288bc1c-22a9-4c3f-bd13-8b1403fdf25f tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: f0934337-b7e8-48ec-b30c-24c92c79267b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1401.708463] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58af669b-77d8-4b2a-9fe6-c46026e55958 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1401.721723] env[67820]: DEBUG nova.compute.manager [None req-e288bc1c-22a9-4c3f-bd13-8b1403fdf25f tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] [instance: f0934337-b7e8-48ec-b30c-24c92c79267b] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1401.738856] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 31ec9cab-abfb-4a73-8df8-057670201267 could not be found. 
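The InstanceNotFound warning just above, with the very next entry logging "Instance destroyed" anyway, shows the driver's destroy path being idempotent: terminate raced with an earlier cleanup, the VM is already gone, and the exception is swallowed rather than failing the teardown. A sketch of that pattern with a fabricated backend class (illustrative only, not Nova's actual code):

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    class FakeBackend:
        def __init__(self, vms):
            self._vms = set(vms)

        def unregister(self, uuid):
            if uuid not in self._vms:
                raise InstanceNotFound(uuid)
            self._vms.discard(uuid)

    def destroy(backend, uuid):
        # Treat "already gone" as success so a terminate that races with
        # another delete still converges instead of erroring out.
        try:
            backend.unregister(uuid)
        except InstanceNotFound as exc:
            print('Instance does not exist on backend: %s' % exc)
        print('Instance destroyed')

    # The racing case from the log: the VM was already removed.
    destroy(FakeBackend([]), '31ec9cab-abfb-4a73-8df8-057670201267')
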
[ 1401.739080] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1401.739256] env[67820]: INFO nova.compute.manager [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1401.739492] env[67820]: DEBUG oslo.service.loopingcall [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1401.741592] env[67820]: DEBUG nova.compute.manager [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1401.741696] env[67820]: DEBUG nova.network.neutron [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1401.753726] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e288bc1c-22a9-4c3f-bd13-8b1403fdf25f tempest-ImagesTestJSON-640122971 tempest-ImagesTestJSON-640122971-project-member] Lock "f0934337-b7e8-48ec-b30c-24c92c79267b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.108s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.764631] env[67820]: DEBUG nova.compute.manager [None req-012fc784-4067-4078-860c-cc16e44e0cb4 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: 4432bc55-aadb-4c3b-8b15-28edfbb40d66] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1401.767398] env[67820]: DEBUG nova.network.neutron [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1401.774909] env[67820]: INFO nova.compute.manager [-] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] Took 0.03 seconds to deallocate network for instance. [ 1401.806430] env[67820]: DEBUG nova.compute.manager [None req-012fc784-4067-4078-860c-cc16e44e0cb4 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: 4432bc55-aadb-4c3b-8b15-28edfbb40d66] Instance disappeared before build. 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1401.833586] env[67820]: DEBUG oslo_concurrency.lockutils [None req-012fc784-4067-4078-860c-cc16e44e0cb4 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "4432bc55-aadb-4c3b-8b15-28edfbb40d66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.266s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.845200] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1401.901029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37d2300e-6379-409f-aff9-17c645169901 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "31ec9cab-abfb-4a73-8df8-057670201267" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.214s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.902270] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "31ec9cab-abfb-4a73-8df8-057670201267" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 262.298s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.902568] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 31ec9cab-abfb-4a73-8df8-057670201267] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1401.902759] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "31ec9cab-abfb-4a73-8df8-057670201267" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1401.910805] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1401.911120] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1401.912480] env[67820]: INFO nova.compute.claims [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1402.148658] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abc1b959-c5dd-4d9b-b2c7-e3ed452f2c2a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.156276] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-508e38fe-0acd-4532-a5cb-5767eedfbfbb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.186899] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a32bad-9fcc-4727-bb15-328c0444246c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.194157] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fe72d0b-9fdc-4c95-a6ea-db33d72fc2f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.207399] env[67820]: DEBUG nova.compute.provider_tree [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1402.216290] env[67820]: DEBUG nova.scheduler.client.report [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 
1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1402.232476] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1402.233141] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1402.266601] env[67820]: DEBUG nova.compute.utils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1402.268061] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1402.268236] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1402.299534] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1402.373581] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1402.393370] env[67820]: DEBUG nova.policy [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8b9868addae45a49b19e7058f737988', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83044475bfd24b14a5a95b4b3fa0376c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1402.405326] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1402.405661] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1402.405744] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1402.405930] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1402.406088] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1402.406238] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1402.406489] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846
tempest-ServersTestJSON-449910846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1402.406641] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1402.406812] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1402.406970] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1402.407237] env[67820]: DEBUG nova.virt.hardware [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1402.408985] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-445b89ba-2637-4766-a14c-c4539d8bd095 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.416470] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1a12c24-5d1c-4ce5-8832-f48ac8e1a015 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.932537] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Successfully created port: 2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1403.430382] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1403.430629] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1403.661397] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae
tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Successfully updated port: 2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1403.677912] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1403.677912] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1403.677912] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1403.713359] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1403.883460] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Updating instance_info_cache with network_info: [{"id": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "address": "fa:16:3e:c8:b1:29", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2f4b1a5f-ae", "ovs_interfaceid": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1403.898252] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1403.898562] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance network_info: |[{"id": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "address": "fa:16:3e:c8:b1:29", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2f4b1a5f-ae", "ovs_interfaceid": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1403.898961] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:b1:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '365ac5b1-6d83-4dfe-887f-60574d7f6124', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2f4b1a5f-aec1-46b7-967b-275d67bb8ef6', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1403.906505] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Creating folder: Project (83044475bfd24b14a5a95b4b3fa0376c). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1403.907051] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de24bffd-65d0-4af3-a6f4-8f214860bd32 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1403.919113] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Created folder: Project (83044475bfd24b14a5a95b4b3fa0376c) in parent group-v692668. [ 1403.919307] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Creating folder: Instances. Parent ref: group-v692742. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1403.919539] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68360941-8a0c-4937-a4ef-a18319522959 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1403.928620] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Created folder: Instances in parent group-v692742. [ 1403.928840] env[67820]: DEBUG oslo.service.loopingcall [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1403.929031] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1403.929227] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f6a6ccc9-f759-42cb-8700-bce087246457 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1403.948101] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1403.948101] env[67820]: value = "task-3467404" [ 1403.948101] env[67820]: _type = "Task" [ 1403.948101] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1403.955435] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467404, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1404.462293] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467404, 'name': CreateVM_Task, 'duration_secs': 0.286881} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1404.462540] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1404.463418] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1404.463618] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1404.464044] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1404.464359] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-34443496-a09e-43ad-bc34-1c16d7a9b471 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1404.469804] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 1404.469804] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]529336bb-bfd2-70fd-262a-72842866653f" [ 1404.469804] env[67820]: _type = "Task" [ 1404.469804] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1404.478889] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]529336bb-bfd2-70fd-262a-72842866653f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1404.651080] env[67820]: DEBUG nova.compute.manager [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Received event network-vif-plugged-2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1404.651080] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Acquiring lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1404.651080] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1404.651080] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1404.651080] env[67820]: DEBUG nova.compute.manager [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] No waiting events found dispatching network-vif-plugged-2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1404.651304] env[67820]: WARNING nova.compute.manager [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Received unexpected event network-vif-plugged-2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 for instance with vm_state building and task_state spawning. [ 1404.651485] env[67820]: DEBUG nova.compute.manager [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Received event network-changed-2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1404.651645] env[67820]: DEBUG nova.compute.manager [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Refreshing instance network info cache due to event network-changed-2f4b1a5f-aec1-46b7-967b-275d67bb8ef6.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1404.651826] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Acquiring lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1404.651962] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Acquired lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1404.652128] env[67820]: DEBUG nova.network.neutron [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Refreshing network info cache for port 2f4b1a5f-aec1-46b7-967b-275d67bb8ef6 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1404.981131] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1404.981574] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1404.981574] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1405.204662] env[67820]: DEBUG nova.network.neutron [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Updated VIF entry in instance network info cache for port 2f4b1a5f-aec1-46b7-967b-275d67bb8ef6. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1405.205104] env[67820]: DEBUG nova.network.neutron [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Updating instance_info_cache with network_info: [{"id": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "address": "fa:16:3e:c8:b1:29", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2f4b1a5f-ae", "ovs_interfaceid": "2f4b1a5f-aec1-46b7-967b-275d67bb8ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1405.216795] env[67820]: DEBUG oslo_concurrency.lockutils [req-68ae430a-15be-4e45-a1a6-d0e9b003f335 req-11549d44-4273-412a-88fb-b8fb92aa0fcb service nova] Releasing lock "refresh_cache-11320faf-fa01-49c8-9d96-af9a4f6c5095" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1416.627124] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "abcea602-a4fc-4dea-9261-a0111db20f84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.627124] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "abcea602-a4fc-4dea-9261-a0111db20f84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.653565] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "0e0e1852-25a6-42dd-9d5a-08af14e6423a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.654021] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396
tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "0e0e1852-25a6-42dd-9d5a-08af14e6423a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1418.621820] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1418.622171] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1418.633448] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 0 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1420.621694] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1420.621977] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1425.631700] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.621825] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1427.630403] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1427.630749] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1428.622780] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.622301] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.622587] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1429.622587] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1429.644909] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.645339] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.645568] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.645719] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.645866] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646015] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646148] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646269] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646389] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646506] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1429.646628] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1429.647132] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1431.621714] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1431.621714] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.622641] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1432.635064] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1432.635261] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1432.635443] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1432.635614] env[67820]: 
DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1432.637151] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-913bc356-1642-4eac-93c2-db5a7746ee27 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.645854] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1024c3c4-71f3-414f-a24c-6bd74a1e993f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.659920] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c734c309-401c-407b-aac0-3e524575fb19 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.666117] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5093db-7216-4b06-86e8-cd4a0c90b17c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.694565] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180886MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1432.694925] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1432.694925] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1432.842688] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance cfc7ee69-6da9-4f70-b245-17b12674feeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.842863] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.842989] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843127] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843247] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843365] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843484] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843666] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843795] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.843911] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1432.854960] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.866025] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.875057] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.884887] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e304b657-1f29-46e5-9f52-8809f8b29606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.894536] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.904620] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance abcea602-a4fc-4dea-9261-a0111db20f84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.913767] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0e0e1852-25a6-42dd-9d5a-08af14e6423a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1432.914038] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1432.914194] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1432.929469] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1432.943299] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1432.943540] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1432.953903] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1432.973986] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1433.157788] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8955cde-6ced-40c1-9d07-a1d6f2a59ec8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.165304] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-812d7659-dda4-4165-9d8c-a17d11b96955 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.195280] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe8600e4-a1ee-4402-aa0f-6a8f0236e448 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.202266] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b76793f-6c25-4385-8976-69bbe642c5e6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.215298] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1433.223872] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1433.237507] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1433.237692] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.237200] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1438.468469] env[67820]: DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1444.919063] env[67820]: WARNING oslo_vmware.rw_handles [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles File
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1444.919063] env[67820]: ERROR oslo_vmware.rw_handles [ 1444.919063] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1444.920927] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1444.921182] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Copying Virtual Disk [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/dcffeeac-db81-4e99-a715-df33e3cd2c7e/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1444.921468] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a9c5aae3-0521-4439-a4a7-de7cc8cffd33 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1444.929541] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for the task: (returnval){ [ 1444.929541] env[67820]: value = "task-3467405" [ 1444.929541] env[67820]: _type = "Task" [ 1444.929541] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1444.937426] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Task: {'id': task-3467405, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1445.441037] env[67820]: DEBUG oslo_vmware.exceptions [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1445.441169] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1445.441746] env[67820]: ERROR nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1445.441746] env[67820]: Faults: ['InvalidArgument'] [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Traceback (most recent call last): [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] yield resources [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self.driver.spawn(context, instance, image_meta, [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self._fetch_image_if_missing(context, vi) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] image_cache(vi, tmp_image_ds_loc) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] vm_util.copy_virtual_disk( [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] session._wait_for_task(vmdk_copy_task) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return self.wait_for_task(task_ref) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return evt.wait() [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] result = hub.switch() [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return self.greenlet.switch() [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self.f(*self.args, **self.kw) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] raise exceptions.translate_fault(task_info.error) [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Faults: ['InvalidArgument'] [ 1445.441746] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] [ 1445.442692] env[67820]: INFO nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Terminating instance [ 1445.443781] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1445.444425] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 
tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1445.444870] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-45f3cbe3-1bdf-47b9-9372-819f6ee35104 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.447094] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1445.447290] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1445.448120] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1eced68-6e15-49fc-be88-02b4f849d960 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.455056] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1445.455152] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-497d25e4-d3a5-484b-b1dc-f922ab521a98 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.457327] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1445.457495] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1445.458426] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf7db797-0935-4388-89db-8f2e1ec1b190 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.463194] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1445.463194] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52ad8a24-1651-ecfc-85ef-7ca36dd87495" [ 1445.463194] env[67820]: _type = "Task" [ 1445.463194] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1445.477416] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1445.477637] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating directory with path [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1445.477849] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-73b8036b-2dfa-4ea3-8184-89e86f2534ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.497946] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created directory with path [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1445.498166] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Fetch image to [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1445.498337] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1445.499098] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8dbb589-fdd6-42fc-a54b-3961ca6008c3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.505720] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2b8c670-4e16-445f-8f65-69d5d2d3692b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.514628] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3285b32a-4a25-409d-9418-b441e8db1c07 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.547862] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ba922205-d77d-42ae-bf73-30e3cef6069b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.550432] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1445.550629] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1445.550801] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Deleting the datastore file [datastore1] cfc7ee69-6da9-4f70-b245-17b12674feeb {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1445.551038] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c67c8338-6f2f-44da-b46a-eadec1f603d8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.557249] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a20a9766-289d-4b69-8781-d0a23f3fa570 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.558955] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for the task: (returnval){ [ 1445.558955] env[67820]: value = "task-3467407" [ 1445.558955] env[67820]: _type = "Task" [ 1445.558955] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1445.566463] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Task: {'id': task-3467407, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1445.579698] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1445.630248] env[67820]: DEBUG oslo_vmware.rw_handles [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1445.690410] env[67820]: DEBUG oslo_vmware.rw_handles [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1445.690410] env[67820]: DEBUG oslo_vmware.rw_handles [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1446.068527] env[67820]: DEBUG oslo_vmware.api [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Task: {'id': task-3467407, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071945} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1446.068854] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1446.068894] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1446.069073] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1446.069256] env[67820]: INFO nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1446.071401] env[67820]: DEBUG nova.compute.claims [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1446.071572] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1446.071810] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1446.318409] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea78351-744b-4c1c-8305-c3bd831d0166 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.327043] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac7632b0-1312-432a-b2ad-19074843cd33 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.358063] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002d7783-17d8-40d2-b459-c331736f4b2e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.365345] env[67820]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae275eff-b042-4357-a623-69460150236f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.379521] env[67820]: DEBUG nova.compute.provider_tree [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1446.388448] env[67820]: DEBUG nova.scheduler.client.report [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1446.403291] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.331s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1446.403820] env[67820]: ERROR nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1446.403820] env[67820]: Faults: ['InvalidArgument'] [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Traceback (most recent call last): [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self.driver.spawn(context, instance, image_meta, [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self._fetch_image_if_missing(context, vi) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] image_cache(vi, tmp_image_ds_loc) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] vm_util.copy_virtual_disk( [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] session._wait_for_task(vmdk_copy_task) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return self.wait_for_task(task_ref) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return evt.wait() [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] result = hub.switch() [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] return self.greenlet.switch() [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] self.f(*self.args, **self.kw) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] raise exceptions.translate_fault(task_info.error) [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Faults: ['InvalidArgument'] [ 1446.403820] env[67820]: ERROR nova.compute.manager [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] [ 1446.404640] env[67820]: DEBUG nova.compute.utils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: 
cfc7ee69-6da9-4f70-b245-17b12674feeb] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1446.406227] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Build of instance cfc7ee69-6da9-4f70-b245-17b12674feeb was re-scheduled: A specified parameter was not correct: fileType [ 1446.406227] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1446.406601] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1446.406771] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1446.406941] env[67820]: DEBUG nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1446.407117] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1446.936845] env[67820]: DEBUG nova.network.neutron [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1446.951743] env[67820]: INFO nova.compute.manager [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Took 0.54 seconds to deallocate network for instance. 
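
The failure path traced above is the standard oslo.vmware poll/translate/raise pattern: the task poller re-reads CopyVirtualDisk_Task until the task reports an error, translates the server-side fault into a local VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), and that exception is what lets nova's compute manager abort the resource claim, deallocate networking, and re-schedule the build. Below is a minimal standard-library sketch of that shape, not oslo.vmware's actual code; poll_task, VimFault, and the fake task states are hypothetical illustrations.

    import time


    class VimFault(Exception):
        """Stand-in for the VimFaultException seen above (hypothetical class)."""

        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list


    def poll_task(get_task_info, interval=0.5, timeout=30.0):
        # Poll a vCenter-style task dict until it finishes, mirroring the
        # traceback above where _poll_task raises a translated fault.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # The server-side task error becomes a local exception; the
                # caller (here, the spawn path) can then abort and reschedule.
                raise VimFault(info['error']['msg'], info['error']['faults'])
            time.sleep(interval)  # 'queued'/'running': keep polling
        raise TimeoutError('task did not complete in time')


    # Example run, failing the same way task-3467405 does above.
    states = iter([
        {'state': 'running', 'error': None},
        {'state': 'error',
         'error': {'msg': 'A specified parameter was not correct: fileType',
                   'faults': ['InvalidArgument']}},
    ])
    try:
        poll_task(lambda: next(states), interval=0.0)
    except VimFault as exc:
        print(exc, exc.fault_list)

The design point visible in the log is that polling keeps the green thread cooperative (the eventlet hub.switch() frames in the traceback) while still surfacing vCenter faults as ordinary Python exceptions at the call site.
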
[ 1447.050241] env[67820]: INFO nova.scheduler.client.report [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Deleted allocations for instance cfc7ee69-6da9-4f70-b245-17b12674feeb [ 1447.070493] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b5ba0ff4-6412-4d3c-bc1f-a324113cceaf tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 641.020s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.071655] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 444.392s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.071872] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1447.072089] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.072260] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.074856] env[67820]: INFO nova.compute.manager [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Terminating instance [ 1447.076608] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquiring lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1447.076775] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081
tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Acquired lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1447.076943] env[67820]: DEBUG nova.network.neutron [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1447.087198] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1447.100590] env[67820]: DEBUG nova.network.neutron [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1447.143727] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1447.144019] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.145584] env[67820]: INFO nova.compute.claims [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1447.226678] env[67820]: DEBUG nova.network.neutron [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1447.240469] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Releasing lock "refresh_cache-cfc7ee69-6da9-4f70-b245-17b12674feeb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1447.241225] env[67820]: DEBUG nova.compute.manager [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: 
cfc7ee69-6da9-4f70-b245-17b12674feeb] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1447.241434] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1447.241984] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3e6e9a72-9e34-42ac-8afa-765890dd20a8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.251187] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8716682b-1a2a-4226-9600-2b2a6408dbbb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.283040] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cfc7ee69-6da9-4f70-b245-17b12674feeb could not be found. [ 1447.283250] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1447.283427] env[67820]: INFO nova.compute.manager [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1447.283701] env[67820]: DEBUG oslo.service.loopingcall [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1447.286169] env[67820]: DEBUG nova.compute.manager [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1447.286260] env[67820]: DEBUG nova.network.neutron [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1447.304710] env[67820]: DEBUG nova.network.neutron [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1447.313287] env[67820]: DEBUG nova.network.neutron [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1447.324236] env[67820]: INFO nova.compute.manager [-] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] Took 0.04 seconds to deallocate network for instance. [ 1447.439664] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e535b908-fc1c-4368-9cba-c65e87943576 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.447623] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4eb7ed-cca5-458d-b763-5aa0850ba0e7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.454236] env[67820]: DEBUG oslo_concurrency.lockutils [None req-485b6dd5-ece6-4e75-ab75-27ac63052061 tempest-AttachVolumeShelveTestJSON-1475705081 tempest-AttachVolumeShelveTestJSON-1475705081-project-member] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.383s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.456083] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 307.850s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1447.456353] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: cfc7ee69-6da9-4f70-b245-17b12674feeb] During sync_power_state the instance has a pending task (spawning). Skip. 
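
The lock lines through here show oslo.concurrency's instrumented lock lifecycle: "Acquiring lock" when a caller starts waiting, "acquired ... waited Ns" once it gets the lock, and '"released" ... held Ns' on exit, all keyed by the instance UUID so the build, the terminate, and the _sync_power_states periodic task serialize against each other; that is why terminate_instance waited 444.392s and the power-state sync 307.850s while the build held the lock. The following is a small sketch of that pattern using only the standard library; timed_lock and the module-level registry are hypothetical helpers in the spirit of lockutils, not its real interface.

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}  # name -> threading.Lock, a process-local registry


    @contextmanager
    def timed_lock(name, owner):
        # Emit the three phases seen in the log: acquiring, acquired
        # (with time spent waiting), released (with time spent holding).
        lock = _locks.setdefault(name, threading.Lock())
        print(f'Acquiring lock "{name}" by "{owner}"')
        start = time.monotonic()
        lock.acquire()
        acquired = time.monotonic()
        print(f'Lock "{name}" acquired by "{owner}" :: '
              f'waited {acquired - start:.3f}s')
        try:
            yield
        finally:
            lock.release()
            print(f'Lock "{name}" "released" by "{owner}" :: '
                  f'held {time.monotonic() - acquired:.3f}s')


    # A long 'waited' value means another caller held the same named lock
    # for that long, serializing work on one instance.
    with timed_lock('cfc7ee69-6da9-4f70-b245-17b12674feeb',
                    'do_terminate_instance'):
        pass
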
[ 1447.456603] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "cfc7ee69-6da9-4f70-b245-17b12674feeb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.002s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.483339] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08c53763-fe24-4bbf-9c51-fb928e1857fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.490930] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acd6f093-f342-4df5-bc9f-6febaf9a54e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.504178] env[67820]: DEBUG nova.compute.provider_tree [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1447.512423] env[67820]: DEBUG nova.scheduler.client.report [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1447.526441] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1447.526950] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1447.560490] env[67820]: DEBUG nova.compute.utils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1447.561889] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Allocating IP information in the background. 
{{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1447.562015] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1447.570204] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1447.617261] env[67820]: DEBUG nova.policy [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bb006fa217a9496f819f6d98acbd9c23', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98bd16e535c84bcd932ef0a99d723cc2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1447.631182] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Start spawning the instance on the hypervisor. 
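The policy line above is a routine DEBUG-level denial: network:attach_external_network is admin-only by default, and the tempest credentials carry only the member and reader roles, so Nova simply proceeds without external networks. A minimal sketch of that kind of check with the real oslo.policy API; the simplified check string 'is_admin:True' and the trimmed creds dict are assumptions, not Nova's exact rule definition:

    # Sketch of an oslo.policy check like the one logged above.
    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(policy.RuleDefault(
        'network:attach_external_network', 'is_admin:True'))  # simplified default

    creds = {'is_admin': False, 'roles': ['member', 'reader']}  # trimmed from the log
    print(enforcer.enforce('network:attach_external_network', {}, creds))  # False
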
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1447.656129] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1447.656357] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1447.656513] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1447.656729] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1447.656884] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1447.657039] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1447.657249] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1447.657408] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1447.657568] 
env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1447.657729] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1447.657896] env[67820]: DEBUG nova.virt.hardware [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1447.658735] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2990ac42-2324-4641-b764-3f035bf8533f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1447.666742] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a8779b6-0e52-47d3-806f-e61952f8a800 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.013230] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Successfully created port: 8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1448.661439] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Successfully updated port: 8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1448.674971] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1448.675261] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1448.675457] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1448.713082] env[67820]: DEBUG 
nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1448.888772] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Updating instance_info_cache with network_info: [{"id": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "address": "fa:16:3e:9c:5b:de", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8533c12e-b5", "ovs_interfaceid": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1448.899595] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1448.899900] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance network_info: |[{"id": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "address": "fa:16:3e:9c:5b:de", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8533c12e-b5", 
"ovs_interfaceid": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1448.900285] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9c:5b:de', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a485857d-7086-4dcf-9d65-d0dcd177fcb0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8533c12e-b514-4ef6-beca-ccae2a9d3e5b', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1448.908176] env[67820]: DEBUG oslo.service.loopingcall [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1448.908593] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1448.908816] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b1a50aa-bff7-486d-9801-49dc25a86f82 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1448.929666] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1448.929666] env[67820]: value = "task-3467408" [ 1448.929666] env[67820]: _type = "Task" [ 1448.929666] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1448.937946] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467408, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1448.976960] env[67820]: DEBUG nova.compute.manager [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Received event network-vif-plugged-8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1448.977186] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Acquiring lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1448.977428] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1448.977605] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1448.977798] env[67820]: DEBUG nova.compute.manager [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] No waiting events found dispatching network-vif-plugged-8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1448.977986] env[67820]: WARNING nova.compute.manager [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Received unexpected event network-vif-plugged-8533c12e-b514-4ef6-beca-ccae2a9d3e5b for instance with vm_state building and task_state spawning. [ 1448.978166] env[67820]: DEBUG nova.compute.manager [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Received event network-changed-8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1448.978344] env[67820]: DEBUG nova.compute.manager [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Refreshing instance network info cache due to event network-changed-8533c12e-b514-4ef6-beca-ccae2a9d3e5b. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1448.978537] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Acquiring lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1448.978685] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Acquired lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1448.978860] env[67820]: DEBUG nova.network.neutron [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Refreshing network info cache for port 8533c12e-b514-4ef6-beca-ccae2a9d3e5b {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1449.232610] env[67820]: DEBUG nova.network.neutron [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Updated VIF entry in instance network info cache for port 8533c12e-b514-4ef6-beca-ccae2a9d3e5b. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1449.233009] env[67820]: DEBUG nova.network.neutron [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Updating instance_info_cache with network_info: [{"id": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "address": "fa:16:3e:9c:5b:de", "network": {"id": "d4b6e4dc-1215-4b9c-bfcd-c25045102f1c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-844337455-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98bd16e535c84bcd932ef0a99d723cc2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a485857d-7086-4dcf-9d65-d0dcd177fcb0", "external-id": "nsx-vlan-transportzone-232", "segmentation_id": 232, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8533c12e-b5", "ovs_interfaceid": "8533c12e-b514-4ef6-beca-ccae2a9d3e5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1449.242171] env[67820]: DEBUG oslo_concurrency.lockutils [req-eee2c7d0-d67b-48ed-af14-9e33d074e01a req-a71e2496-87e6-48c2-8a4c-cbbc2def1057 service nova] Releasing lock "refresh_cache-e401a9ad-d6ed-4511-936c-4cf36d41281b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1449.440115] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467408, 'name': CreateVM_Task, 'duration_secs': 0.279008} completed successfully. 
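CreateVM_Task above follows oslo.vmware's invoke-then-poll pattern: the SOAP call returns a task moref immediately, and wait_for_task() polls TaskInfo on a looping call until it reports success (here after 0.279s) or raises the translated fault on error. A sketch against the real session API; the host, credentials, and managed-object references are placeholders, so this will not actually run without a vCenter:

    # Sketch of the task pattern in the log above. VMwareAPISession,
    # invoke_api and wait_for_task are the real oslo.vmware API; everything
    # deployment-specific below is a placeholder.
    from oslo_vmware import api

    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)
    folder_ref = pool_ref = config_spec = None   # stand-ins for real morefs/spec
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=pool_ref)
    task_info = session.wait_for_task(task)      # polls until success or fault
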
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1449.440248] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1449.440887] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1449.441067] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1449.441377] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1449.441640] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1be98e6b-0fae-4511-a9d3-ab3b4998a1f4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1449.445823] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1449.445823] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e00583-d5de-7c86-f844-9d827195d77c" [ 1449.445823] env[67820]: _type = "Task" [ 1449.445823] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1449.452913] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e00583-d5de-7c86-f844-9d827195d77c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1449.956957] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1449.957250] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1449.957452] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1455.610095] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.615962] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1485.690808] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "a84c5537-9ad1-44d6-b732-fda1156bff86" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.691128] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1486.621591] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.622541] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes 
{{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.622824] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1490.622069] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.622423] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1490.622423] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1490.644051] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644233] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644367] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644489] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644609] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644726] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644841] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. 
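The "Running periodic task" lines come from oslo.service's PeriodicTasks machinery: decorated ComputeManager methods are invoked on their configured spacing by the service ticker, and several guard on config, like _reclaim_queued_deletes skipping when reclaim_instance_interval <= 0. A minimal self-contained sketch; PeriodicTasks and the decorator are the real oslo.service API, while this manager, its task, and the option registration are illustrative:

    # Sketch of the periodic-task machinery behind the log lines above.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)   # run at most once per 60s
        def _reclaim_queued_deletes(self, context):
            if CONF.reclaim_instance_interval <= 0:
                return   # mirrors "CONF.reclaim_instance_interval <= 0, skipping..."

    # One scheduler tick; the service normally calls this in a loop.
    Manager().run_periodic_tasks(context=None)
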
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.644977] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.645140] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.645261] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1490.645379] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1490.645843] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.646024] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1492.640911] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1493.621997] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1493.806889] env[67820]: WARNING oslo_vmware.rw_handles [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1493.806889] env[67820]: ERROR oslo_vmware.rw_handles [ 1493.807492] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1493.809597] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1493.809597] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Copying Virtual Disk [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/4c2f48da-af9f-409c-9a2f-03174295696c/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1493.809831] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1793c404-60a9-4ef7-aa49-cbffd81ab71c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1493.819723] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1493.819723] env[67820]: value = "task-3467419" [ 1493.819723] env[67820]: _type = "Task" [ 1493.819723] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1493.829316] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467419, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.330466] env[67820]: DEBUG oslo_vmware.exceptions [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1494.330765] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1494.331355] env[67820]: ERROR nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1494.331355] env[67820]: Faults: ['InvalidArgument'] [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] Traceback (most recent call last): [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] yield resources [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self.driver.spawn(context, instance, image_meta, [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self._fetch_image_if_missing(context, vi) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] image_cache(vi, tmp_image_ds_loc) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] vm_util.copy_virtual_disk( [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] session._wait_for_task(vmdk_copy_task) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return self.wait_for_task(task_ref) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return evt.wait() [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] result = hub.switch() [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return self.greenlet.switch() [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self.f(*self.args, **self.kw) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] raise exceptions.translate_fault(task_info.error) [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] Faults: ['InvalidArgument'] [ 1494.331355] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] [ 1494.332368] env[67820]: INFO nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Terminating instance [ 1494.333223] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1494.333432] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1494.334073] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 
tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1494.334273] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1494.334498] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-485bb00b-7f60-4915-8471-2ce75ff1b6ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.336726] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfa58951-1eb3-47bf-a766-7ad8cd5adbad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.343870] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1494.344129] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7368b8e6-46b7-441a-8c24-2a526a63d0b3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.346390] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1494.346563] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1494.347541] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef972a36-005d-4b19-b951-a23e73c409f5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.352832] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1494.352832] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5228fef0-94fc-7d15-4ade-90484ce23296" [ 1494.352832] env[67820]: _type = "Task" [ 1494.352832] env[67820]: } to complete. 
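The InvalidArgument failure above ends in oslo.vmware's generic fault path: get_fault_class found no registered exception class for 'InvalidArgument' ("Fault InvalidArgument not matched."), so the task poller raised VimFaultException with the fault names attached, which is what the compute manager traceback shows. A caller can discriminate on that list; a sketch, assuming a session and task reference already exist:

    # Sketch of handling the fault seen above. VimFaultException and its
    # fault_list attribute are the real oslo.vmware API; 'session' and
    # 'vmdk_copy_task' are assumed to be in scope.
    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(vmdk_copy_task)
    except vexc.VimFaultException as e:
        if 'InvalidArgument' in e.fault_list:
            # Here: a bad CopyVirtualDisk spec ("A specified parameter was
            # not correct: fileType"); clean up the instance, then re-raise.
            pass
        raise
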
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1494.360552] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5228fef0-94fc-7d15-4ade-90484ce23296, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.419260] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1494.419496] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1494.419675] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleting the datastore file [datastore1] 45a68888-979e-4255-98a0-bcb289f57830 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1494.419949] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d21d7b0-24d7-4297-b2b3-bdd311109bf1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.427288] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1494.427288] env[67820]: value = "task-3467421" [ 1494.427288] env[67820]: _type = "Task" [ 1494.427288] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1494.436519] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467421, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1494.621336] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.621637] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1494.636144] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1494.636334] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1494.639320] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1494.639320] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1494.639320] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4eda4e0-15ab-44d6-919a-8f2616072e5e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.649052] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00712c82-ad28-4138-afb7-b329282d5136 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.664085] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62510f5f-c30f-4d5d-b162-7a2ba35cd8bb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.671113] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd09b96-89ee-4c29-a330-896e31b9246d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.701847] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180944MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1494.702339] env[67820]: 
DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1494.702339] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1494.778244] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 45a68888-979e-4255-98a0-bcb289f57830 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.778454] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.778593] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.778746] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.778907] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.779064] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.779220] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.779458] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.779623] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.779744] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1494.791195] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.801791] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.812769] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e304b657-1f29-46e5-9f52-8809f8b29606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.822844] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.832560] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance abcea602-a4fc-4dea-9261-a0111db20f84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.842896] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0e0e1852-25a6-42dd-9d5a-08af14e6423a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.852745] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a84c5537-9ad1-44d6-b732-fda1156bff86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1494.853186] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1494.853412] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1494.867522] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1494.867779] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating directory with path [datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1494.868017] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bb66501e-a56a-4b3f-94bd-13e602f8a0c1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.883021] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created directory with path 
[datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1494.883230] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Fetch image to [datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1494.883400] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1494.886390] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48889376-e79f-4e79-919d-0a0d13913e6e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.894128] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cca34f9-2ad4-4522-ac13-af373138e51f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.905525] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0508c3eb-23a8-40d0-ba74-1046420e1926 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.944442] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b56b91-897e-4ce3-a656-b3a3def77318 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.953851] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-211e0814-e0d3-4ea0-8f62-e8f32a570a6a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1494.955657] env[67820]: DEBUG oslo_vmware.api [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467421, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063953} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1494.958014] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1494.958221] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1494.958408] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1494.958582] env[67820]: INFO nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1494.960772] env[67820]: DEBUG nova.compute.claims [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1494.960888] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1494.982296] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1495.033547] env[67820]: DEBUG oslo_vmware.rw_handles [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1495.091738] env[67820]: DEBUG oslo_vmware.rw_handles [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Completed reading data from the image iterator. 
{{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1495.091935] env[67820]: DEBUG oslo_vmware.rw_handles [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1495.142033] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9765e36-0540-4a86-97f0-700c03bc66d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.150302] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c4bcf66-9167-4ffb-bb1b-0adee3c18fe8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.181473] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa27cc38-475c-423f-8b69-ca673eed5f06 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.188585] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c74442e2-d810-4727-9af7-82a458de43f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.201423] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1495.210706] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1495.224582] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1495.224764] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.523s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1495.225102] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] 
Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.264s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1495.449915] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77cc8b7f-9f26-4733-815b-a687f5ecfe95 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.457417] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-179f3982-eee0-457b-8ae9-64d77733c78c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.487122] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ebb054b-3aad-430e-924e-3a640909127f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.493832] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa63c84-137d-4178-b05b-94a0289b8f35 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1495.507392] env[67820]: DEBUG nova.compute.provider_tree [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1495.516674] env[67820]: DEBUG nova.scheduler.client.report [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1495.529841] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.305s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1495.530403] env[67820]: ERROR nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1495.530403] env[67820]: Faults: ['InvalidArgument'] [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] Traceback (most recent call last): [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 
45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self.driver.spawn(context, instance, image_meta, [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] self._fetch_image_if_missing(context, vi) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] image_cache(vi, tmp_image_ds_loc) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] vm_util.copy_virtual_disk( [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] session._wait_for_task(vmdk_copy_task) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return self.wait_for_task(task_ref) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return evt.wait() [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] result = hub.switch() [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] return self.greenlet.switch() [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] 
self.f(*self.args, **self.kw) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] raise exceptions.translate_fault(task_info.error) [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] Faults: ['InvalidArgument'] [ 1495.530403] env[67820]: ERROR nova.compute.manager [instance: 45a68888-979e-4255-98a0-bcb289f57830] [ 1495.531354] env[67820]: DEBUG nova.compute.utils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1495.532620] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Build of instance 45a68888-979e-4255-98a0-bcb289f57830 was re-scheduled: A specified parameter was not correct: fileType [ 1495.532620] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1495.532992] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1495.533203] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1495.533387] env[67820]: DEBUG nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1495.533551] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1496.104438] env[67820]: DEBUG nova.network.neutron [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1496.120233] env[67820]: INFO nova.compute.manager [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Took 0.58 seconds to deallocate network for instance. [ 1496.217640] env[67820]: INFO nova.scheduler.client.report [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleted allocations for instance 45a68888-979e-4255-98a0-bcb289f57830 [ 1496.237890] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4afb6e6c-c664-4788-8de1-faba0fe93094 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.292s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.239472] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.418s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1496.239472] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "45a68888-979e-4255-98a0-bcb289f57830-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1496.239646] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1496.239693] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.241761] env[67820]: INFO nova.compute.manager [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Terminating instance [ 1496.244716] env[67820]: DEBUG nova.compute.manager [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1496.244716] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1496.244716] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95524821-3425-41ee-a1da-d8cece44784d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.253142] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c89c43-bd09-4434-a9ab-47549082620f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.264513] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1496.285875] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 45a68888-979e-4255-98a0-bcb289f57830 could not be found. 
[ 1496.286095] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1496.286272] env[67820]: INFO nova.compute.manager [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1496.286517] env[67820]: DEBUG oslo.service.loopingcall [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1496.286750] env[67820]: DEBUG nova.compute.manager [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1496.286841] env[67820]: DEBUG nova.network.neutron [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1496.310181] env[67820]: DEBUG nova.network.neutron [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1496.320313] env[67820]: INFO nova.compute.manager [-] [instance: 45a68888-979e-4255-98a0-bcb289f57830] Took 0.03 seconds to deallocate network for instance. 
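
Note: the oslo.service.loopingcall record above ("Waiting for function ... _deallocate_network_with_retries to return.") is the retry wrapper around network deallocation. A hedged sketch using oslo_service.loopingcall.RetryDecorator follows; the retry counts and the retried exception are illustrative assumptions, not nova's exact values (nova retries Keystone connection failures here):

    from oslo_service import loopingcall

    def try_deallocate_network(deallocate, context, instance):
        # Retry parameters below are assumed values for illustration.
        @loopingcall.RetryDecorator(max_retry_count=3,
                                    inc_sleep_time=2,
                                    max_sleep_time=12,
                                    exceptions=(ConnectionError,))
        def _deallocate_network_with_retries():
            deallocate(context, instance)

        # The decorator reruns the body on the listed exceptions, sleeping in
        # a greenthread between attempts, which is what produces the
        # "Waiting for function ... to return" record.
        _deallocate_network_with_retries()
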
[ 1496.327468] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1496.327697] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1496.329070] env[67820]: INFO nova.compute.claims [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1496.401452] env[67820]: DEBUG oslo_concurrency.lockutils [None req-75702cba-f864-44f1-948d-1a0804803be7 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "45a68888-979e-4255-98a0-bcb289f57830" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.402389] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "45a68888-979e-4255-98a0-bcb289f57830" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 356.797s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1496.402592] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 45a68888-979e-4255-98a0-bcb289f57830] During sync_power_state the instance has a pending task (deleting). Skip. 
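
Note: nearly every resource-tracker record in this window (instance_claim, abort_instance_claim, _update_available_resource, clean_compute_node_cache) serializes on the same "compute_resources" semaphore; the waited/held durations in the lock records come from the lockutils "inner" wrapper. A minimal sketch of that pattern with oslo_concurrency (the class body is a stand-in, not nova's code):

    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

    class ResourceTracker:
        def __init__(self):
            self.used_vcpus = 0

        @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
        def instance_claim(self, vcpus):
            # Runs mutually exclusive with update_available_resource and
            # abort_instance_claim, so the usage totals reported to placement
            # never interleave with an in-flight claim.
            self.used_vcpus += vcpus
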
[ 1496.402768] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "45a68888-979e-4255-98a0-bcb289f57830" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.546018] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d50de07-a32a-4cbc-ac9b-10584baae267 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.553579] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a6f3899-0e50-47ed-bb82-6ad75d0c8092 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.582355] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47dc18eb-e310-4a2d-8c0a-6e618be9aefc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.589616] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac411a9-e932-4185-889d-a50a11bcdfde {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.602454] env[67820]: DEBUG nova.compute.provider_tree [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1496.612023] env[67820]: DEBUG nova.scheduler.client.report [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1496.625791] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1496.626260] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1496.657428] env[67820]: DEBUG nova.compute.utils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1496.658990] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1496.659189] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1496.667758] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1496.732131] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1496.755963] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1496.756217] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1496.756374] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1496.756553] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1496.756699] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1496.756845] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1496.757240] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1496.757346] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1496.757519] env[67820]: DEBUG 
nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1496.757683] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1496.757852] env[67820]: DEBUG nova.virt.hardware [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1496.758702] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3446493d-483f-44f8-b7b1-7ec906f175ad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1496.762359] env[67820]: DEBUG nova.policy [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1fdcd371f66742e2b8a56846e91e62aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7767a564247b405b92073629bffda753', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1496.769205] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0828ffe3-d501-4515-8aed-5e3850e670c0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1497.175140] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Successfully created port: cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1497.762151] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Successfully updated port: cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1497.775210] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1497.775379] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1497.775531] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1497.813508] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1498.027813] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Updating instance_info_cache with network_info: [{"id": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "address": "fa:16:3e:46:99:18", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfe5a7ce-22", "ovs_interfaceid": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1498.040553] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1498.040940] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance network_info: |[{"id": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "address": "fa:16:3e:46:99:18", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfe5a7ce-22", "ovs_interfaceid": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1498.041418] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:99:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '40c947c4-f471-4d48-8e43-fee54198107e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cfe5a7ce-220b-4427-b00e-ac725a8a99bc', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1498.049892] env[67820]: DEBUG oslo.service.loopingcall [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1498.050420] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1498.050658] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-402739c0-4faf-45e1-b4de-4755adfcda43 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1498.072164] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1498.072164] env[67820]: value = "task-3467422" [ 1498.072164] env[67820]: _type = "Task" [ 1498.072164] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1498.081339] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467422, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1498.165552] env[67820]: DEBUG nova.compute.manager [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Received event network-vif-plugged-cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1498.165780] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Acquiring lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1498.165990] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1498.166173] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1498.166341] env[67820]: DEBUG nova.compute.manager [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] No waiting events found dispatching network-vif-plugged-cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1498.166498] env[67820]: WARNING nova.compute.manager [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Received unexpected event network-vif-plugged-cfe5a7ce-220b-4427-b00e-ac725a8a99bc for instance with vm_state building and task_state spawning. [ 1498.166650] env[67820]: DEBUG nova.compute.manager [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Received event network-changed-cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1498.166824] env[67820]: DEBUG nova.compute.manager [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Refreshing instance network info cache due to event network-changed-cfe5a7ce-220b-4427-b00e-ac725a8a99bc. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1498.166967] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Acquiring lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1498.167114] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Acquired lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1498.167268] env[67820]: DEBUG nova.network.neutron [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Refreshing network info cache for port cfe5a7ce-220b-4427-b00e-ac725a8a99bc {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1498.521935] env[67820]: DEBUG nova.network.neutron [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Updated VIF entry in instance network info cache for port cfe5a7ce-220b-4427-b00e-ac725a8a99bc. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1498.522359] env[67820]: DEBUG nova.network.neutron [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Updating instance_info_cache with network_info: [{"id": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "address": "fa:16:3e:46:99:18", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfe5a7ce-22", "ovs_interfaceid": "cfe5a7ce-220b-4427-b00e-ac725a8a99bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1498.531809] env[67820]: DEBUG oslo_concurrency.lockutils [req-90a68132-5626-4f19-bc99-116d374b7e08 req-99f3367e-00e7-4a64-a34a-5208024576db service nova] Releasing lock "refresh_cache-bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1498.581547] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467422, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1499.082683] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467422, 'name': CreateVM_Task} progress is 99%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1499.583867] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467422, 'name': CreateVM_Task, 'duration_secs': 1.292824} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1499.584165] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1499.584800] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1499.585045] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1499.585510] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1499.585856] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1c62034-21ec-4eda-83b7-d47ffd63292f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1499.591117] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1499.591117] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5270958a-f939-9a79-b4cf-8c37d6a18272" [ 1499.591117] env[67820]: _type = "Task" [ 1499.591117] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1499.600536] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5270958a-f939-9a79-b4cf-8c37d6a18272, 'name': SearchDatastore_Task} progress is 0%. 
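
The lock records above serialize access to the image cache entry "[datastore1] devstack-image-cache_base/<image-id>" so that concurrent builds of the same image queue behind a single fetch. A minimal sketch of that pattern using oslo.concurrency's named locks (assuming the library is installed; the body is a placeholder):

    from oslo_concurrency import lockutils

    IMAGE_ID = '4407539e-b292-42b4-91c4-4faa60d48bab'   # image UUID from the log
    LOCK_NAME = '[datastore1] devstack-image-cache_base/%s' % IMAGE_ID

    with lockutils.lock(LOCK_NAME):
        # search the datastore for a cached copy of the image and fetch it
        # only if missing (the SearchDatastore_Task invocation above)
        pass
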
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1500.102926] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1500.103257] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1500.103508] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1500.106507] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1500.106721] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1500.463474] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1543.827322] env[67820]: WARNING oslo_vmware.rw_handles [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 
1543.827322] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1543.827322] env[67820]: ERROR oslo_vmware.rw_handles [ 1543.827924] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1543.829744] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1543.830032] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Copying Virtual Disk [datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/0bc9883e-a8bc-4b90-82ca-5b1b370bfbf8/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1543.830353] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-00804416-d66e-4b63-9d99-99ce2fdcf550 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1543.842995] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1543.842995] env[67820]: value = "task-3467423" [ 1543.842995] env[67820]: _type = "Task" [ 1543.842995] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1543.850993] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467423, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1544.353959] env[67820]: DEBUG oslo_vmware.exceptions [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Fault InvalidArgument not matched. 
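
The traceback above is the classic streaming-upload failure: rw_handles closes its write handle by calling getresponse(), and because the remote end already dropped the connection without sending a status line, http.client raises RemoteDisconnected. A minimal reproduction of the shape of that call sequence (host and path are placeholders; this will not connect anywhere real):

    import http.client

    conn = http.client.HTTPSConnection('esx-host.example', 443)
    conn.putrequest('PUT', '/folder/tmp-sparse.vmdk')
    conn.putheader('Content-Length', '21318656')
    conn.endheaders()
    # ... stream the image body with conn.send(chunk) ...
    try:
        conn.getresponse()        # raises RemoteDisconnected if the server
                                  # closed without writing a response
    except http.client.RemoteDisconnected:
        # oslo_vmware.rw_handles logs this as a WARNING and continues: the
        # upload may already be complete on the datastore side, which is why
        # the next record still reports "Downloaded image file data".
        pass
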
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1544.354275] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1544.354853] env[67820]: ERROR nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1544.354853] env[67820]: Faults: ['InvalidArgument'] [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Traceback (most recent call last): [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] yield resources [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self.driver.spawn(context, instance, image_meta, [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self._fetch_image_if_missing(context, vi) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] image_cache(vi, tmp_image_ds_loc) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] vm_util.copy_virtual_disk( [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] session._wait_for_task(vmdk_copy_task) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return self.wait_for_task(task_ref) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return evt.wait() [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] result = hub.switch() [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return self.greenlet.switch() [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self.f(*self.args, **self.kw) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] raise exceptions.translate_fault(task_info.error) [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Faults: ['InvalidArgument'] [ 1544.354853] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] [ 1544.355746] env[67820]: INFO nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Terminating instance [ 1544.356883] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1544.356991] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1544.357254] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-522740e6-44be-4eca-adcf-c323c774ecfc 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.359575] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1544.359769] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1544.360508] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49331f41-2653-418a-9a2c-a35e470e5171 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.367108] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1544.367360] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-757f0938-3ab1-4f3a-8eb4-10c114db3805 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.369390] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1544.369563] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1544.370489] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd58e10c-33e1-4578-92c3-e4a90088644f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.375629] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for the task: (returnval){ [ 1544.375629] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5214265d-a017-b73b-7667-bdde71149406" [ 1544.375629] env[67820]: _type = "Task" [ 1544.375629] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1544.387941] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5214265d-a017-b73b-7667-bdde71149406, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1544.431421] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1544.431813] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1544.432057] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleting the datastore file [datastore1] 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1544.432331] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bd1cba9c-e6de-440d-abab-435265247b25 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.438378] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1544.438378] env[67820]: value = "task-3467425" [ 1544.438378] env[67820]: _type = "Task" [ 1544.438378] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1544.446717] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467425, 'name': DeleteDatastoreFile_Task} progress is 0%. 
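
The destroy path above runs as three vCenter steps: unregister the VM, delete its datastore directory, and wait on the resulting DeleteDatastoreFile_Task. A compressed sketch of that sequence; session and its methods are hypothetical stand-ins, not the Nova driver API:

    def destroy_instance(session, vm_ref, datastore_path):
        # "Unregistering the VM" -> "Unregistered the VM"
        session.invoke('UnregisterVM', vm_ref)
        # "Deleting the datastore file [datastore1] <instance-uuid>"
        task = session.invoke('DeleteDatastoreFile_Task', name=datastore_path)
        # "Task: {'id': task-3467425, ...} completed successfully."
        session.wait_for_task(task)
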
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1544.885731] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1544.886049] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Creating directory with path [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1544.886207] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42b1afd5-829a-482d-8f5c-85258b88100d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.896966] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Created directory with path [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1544.897178] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Fetch image to [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1544.897348] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1544.898106] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d774335-dae0-4e49-a07a-0a915af3ad9a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.904823] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e34c7b41-a58b-4c9b-8db2-524611ee1a05 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.914366] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d44e454-a25d-40b3-9ffb-e6a64b5d14db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.950820] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c18bfe13-72d3-49c4-8df2-af2bce7c4a4e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.958239] env[67820]: DEBUG oslo_vmware.api [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467425, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064842} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1544.959666] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1544.959862] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1544.960046] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1544.960229] env[67820]: INFO nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1544.961988] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-75120b94-2813-4cc3-a5d4-feed47d5e77b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1544.964112] env[67820]: DEBUG nova.compute.claims [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1544.964376] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.964542] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1544.990469] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1545.042705] env[67820]: DEBUG oslo_vmware.rw_handles [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1545.102491] env[67820]: DEBUG oslo_vmware.rw_handles [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1545.102684] env[67820]: DEBUG oslo_vmware.rw_handles [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
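
The write-connection record above shows how the upload target is addressed: a datastore file is written over HTTPS at /folder/<path> with dcPath and dsName query parameters, and a Content-Length equal to the image size. A small sketch assembling that URL from the values in the log (the helper itself is illustrative):

    from urllib.parse import urlencode

    def datastore_write_url(host, ds_path, dc_path, ds_name):
        query = urlencode({'dcPath': dc_path, 'dsName': ds_name})
        return 'https://%s:443/folder/%s?%s' % (host, ds_path, query)

    url = datastore_write_url(
        'esx7c2n3.openstack.eu-de-1.cloud.sap',
        'vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/'
        '4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk',
        'ha-datacenter',
        'datastore1')
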
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1545.285539] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b83327-ffe0-4f94-af86-cc63a5920c0c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.293270] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-180227da-1a66-4dca-b129-cf879dc69761 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.324139] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1698c2e4-f094-42ad-a26b-84aa9c4a4f38 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.332035] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c8ee0ef-15d3-4a76-a553-2eedcaa2b503 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.345549] env[67820]: DEBUG nova.compute.provider_tree [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1545.354122] env[67820]: DEBUG nova.scheduler.client.report [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1545.369750] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.405s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.370303] env[67820]: ERROR nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1545.370303] env[67820]: Faults: ['InvalidArgument'] [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Traceback (most recent call last): [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1545.370303] 
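
"Inventory has not changed" above is a cheap no-op check: the freshly computed inventory dict is compared against the cached provider-tree copy, and the placement update is skipped when they are equal. A sketch under that assumption, with the data mirroring the record and push as a hypothetical placement call:

    CACHED_INVENTORY = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }

    def maybe_update_inventory(provider_uuid, new_inventory, push):
        if new_inventory == CACHED_INVENTORY:
            return False          # "Inventory has not changed": skip the API call
        push(provider_uuid, new_inventory)
        return True
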
env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self.driver.spawn(context, instance, image_meta, [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self._fetch_image_if_missing(context, vi) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] image_cache(vi, tmp_image_ds_loc) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] vm_util.copy_virtual_disk( [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] session._wait_for_task(vmdk_copy_task) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return self.wait_for_task(task_ref) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return evt.wait() [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] result = hub.switch() [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] return self.greenlet.switch() [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] self.f(*self.args, **self.kw) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] raise exceptions.translate_fault(task_info.error) [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Faults: ['InvalidArgument'] [ 1545.370303] env[67820]: ERROR nova.compute.manager [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] [ 1545.371184] env[67820]: DEBUG nova.compute.utils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1545.372480] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Build of instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 was re-scheduled: A specified parameter was not correct: fileType [ 1545.372480] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1545.372851] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1545.373042] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1545.373219] env[67820]: DEBUG nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1545.373382] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1545.746391] env[67820]: DEBUG nova.network.neutron [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1545.760337] env[67820]: INFO nova.compute.manager [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Took 0.39 seconds to deallocate network for instance. [ 1545.849745] env[67820]: INFO nova.scheduler.client.report [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleted allocations for instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 [ 1545.873231] env[67820]: DEBUG oslo_concurrency.lockutils [None req-459fa0a3-1169-4b71-88b5-0c7eacfdba56 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 634.785s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.874525] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.661s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.874730] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.875030] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" acquired by 
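
The lock timings above make the serialization visible: the failed build held the per-instance lock for 634.785s, so the terminate request for the same instance queued for 438.661s before it could start. Both paths take a lock named after the instance UUID; a minimal sketch with oslo.concurrency (handlers are placeholders):

    from oslo_concurrency import lockutils

    def locked_do_build_and_run_instance(instance_uuid):
        with lockutils.lock(instance_uuid):
            pass   # build, spawn, and on failure reschedule, all under the lock

    def do_terminate_instance(instance_uuid):
        with lockutils.lock(instance_uuid):
            pass   # runs only once the build path above releases the lock
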
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.875246] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1545.877515] env[67820]: INFO nova.compute.manager [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Terminating instance [ 1545.881312] env[67820]: DEBUG nova.compute.manager [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1545.881545] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1545.881790] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-89576476-a892-4eeb-a355-4ef4a798dc3f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.885425] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1545.892046] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d4d77be-7e53-4f75-b6aa-1dddb7fd627c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1545.921042] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822 could not be found. 
[ 1545.921042] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1545.921447] env[67820]: INFO nova.compute.manager [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1545.921490] env[67820]: DEBUG oslo.service.loopingcall [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1545.922225] env[67820]: DEBUG nova.compute.manager [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1545.922320] env[67820]: DEBUG nova.network.neutron [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1545.939355] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1545.939596] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1545.941076] env[67820]: INFO nova.compute.claims [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1545.950758] env[67820]: DEBUG nova.network.neutron [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1545.972091] env[67820]: INFO nova.compute.manager [-] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] Took 0.05 seconds to deallocate network for instance. 
[ 1546.070087] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b34e92c1-d562-4182-ac36-53d9361a0783 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.196s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.072337] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 406.467s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1546.072707] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 04ddd9b2-85d3-4c6e-8021-f91d7c0f6822] During sync_power_state the instance has a pending task (deleting). Skip. [ 1546.072991] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "04ddd9b2-85d3-4c6e-8021-f91d7c0f6822" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.207125] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73779f4a-aef0-47d6-8a52-d137b4a49e74 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.214795] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0a59ae8-80a4-4f9d-bb8b-008d7bd72d1e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.244525] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5aed8c7-df5f-45ae-94d6-f5c806596769 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.251663] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02ce00b4-bccb-4a1d-9ced-7e73c095dd42 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.265780] env[67820]: DEBUG nova.compute.provider_tree [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1546.274533] env[67820]: DEBUG nova.scheduler.client.report [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
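
"During sync_power_state the instance has a pending task (deleting). Skip." above is the periodic power-state sync declining to touch an instance whose state is in flux. A one-guard sketch of that check (field names are illustrative):

    def query_driver_power_state_and_sync(instance):
        if instance.get('task_state') is not None:
            # a task (here 'deleting') is still in flight; reconciling the
            # power state now would race the ongoing operation, so skip
            return
        # otherwise compare the hypervisor power state with the DB record
        # and issue a stop/start to bring them back in line
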
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1546.289338] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1546.289811] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1546.327871] env[67820]: DEBUG nova.compute.utils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1546.328972] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1546.329138] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1546.343771] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1546.424685] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1546.429072] env[67820]: DEBUG nova.policy [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70e0dc54fb0d421dbdd0a97f6f1c473c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d2d1a68ef76415584b4404eea3a8363', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1546.453414] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1546.453665] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1546.453825] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1546.454012] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1546.454233] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1546.454380] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1546.454654] env[67820]: 
DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1546.454824] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1546.455033] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1546.455219] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1546.455392] env[67820]: DEBUG nova.virt.hardware [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1546.456237] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975970e7-ad0b-42f2-943d-e9eb7d84dccf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.464375] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38af99eb-2ce2-4543-a048-a1acd13cea3e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1546.802147] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Successfully created port: 50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1547.632136] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Successfully updated port: 50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1547.643605] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1547.643810] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquired lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1547.643909] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1547.680890] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1547.784958] env[67820]: DEBUG nova.compute.manager [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Received event network-vif-plugged-50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1547.785198] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Acquiring lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1547.785528] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1547.785727] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1547.785966] env[67820]: DEBUG nova.compute.manager [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] No waiting events found dispatching network-vif-plugged-50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1547.785966] env[67820]: WARNING nova.compute.manager [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Received unexpected event network-vif-plugged-50bad736-fd34-4e71-a00b-6194697c2738 for instance with vm_state building and task_state spawning. 
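The 'No waiting events found' / 'Received unexpected event' pair above reflects Nova's external-event handshake: the compute manager registers a waiter for events such as network-vif-plugged before starting the operation that triggers them, and a Neutron-originated event that arrives with no registered waiter is logged as unexpected. A simplified, hypothetical sketch of that pop-or-warn pattern (not Nova's actual implementation, which keys waiters per instance inside InstanceEvents):

    import threading

    _waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_event(instance_uuid, event_name):
        # Register interest before kicking off the work that emits the event.
        evt = threading.Event()
        _waiters[(instance_uuid, event_name)] = evt
        return evt

    def pop_event(instance_uuid, event_name):
        # Deliver an incoming event; when no waiter was registered, the
        # caller would log the 'Received unexpected event' warning above.
        evt = _waiters.pop((instance_uuid, event_name), None)
        if evt is not None:
            evt.set()
        return evt is not None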
[ 1547.786284] env[67820]: DEBUG nova.compute.manager [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Received event network-changed-50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1547.786496] env[67820]: DEBUG nova.compute.manager [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Refreshing instance network info cache due to event network-changed-50bad736-fd34-4e71-a00b-6194697c2738. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1547.786670] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Acquiring lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1547.866363] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Updating instance_info_cache with network_info: [{"id": "50bad736-fd34-4e71-a00b-6194697c2738", "address": "fa:16:3e:cf:0e:34", "network": {"id": "6972e393-2331-469b-8ada-007c145171ed", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-586972927-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d2d1a68ef76415584b4404eea3a8363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50bad736-fd", "ovs_interfaceid": "50bad736-fd34-4e71-a00b-6194697c2738", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1547.880979] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Releasing lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1547.881300] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance network_info: |[{"id": "50bad736-fd34-4e71-a00b-6194697c2738", "address": "fa:16:3e:cf:0e:34", "network": {"id": "6972e393-2331-469b-8ada-007c145171ed", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-586972927-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d2d1a68ef76415584b4404eea3a8363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50bad736-fd", "ovs_interfaceid": "50bad736-fd34-4e71-a00b-6194697c2738", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1547.881756] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Acquired lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1547.881849] env[67820]: DEBUG nova.network.neutron [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Refreshing network info cache for port 50bad736-fd34-4e71-a00b-6194697c2738 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1547.882926] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:0e:34', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '50bad736-fd34-4e71-a00b-6194697c2738', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1547.891105] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Creating folder: Project (1d2d1a68ef76415584b4404eea3a8363). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1547.892458] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5fd41b81-a1d1-4d3b-a42e-03216362320d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.906836] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Created folder: Project (1d2d1a68ef76415584b4404eea3a8363) in parent group-v692668. [ 1547.907078] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Creating folder: Instances. 
Parent ref: group-v692751. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1547.907351] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c27c0f82-c4db-4848-bf85-d4d369f2a7f4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.919036] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Created folder: Instances in parent group-v692751. [ 1547.919036] env[67820]: DEBUG oslo.service.loopingcall [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1547.919036] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1547.919036] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-693253c1-2d55-40c6-bf75-9ea95ef55b12 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1547.942272] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1547.942272] env[67820]: value = "task-3467428" [ 1547.942272] env[67820]: _type = "Task" [ 1547.942272] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1547.954060] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467428, 'name': CreateVM_Task} progress is 6%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1548.226953] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1548.235448] env[67820]: DEBUG nova.network.neutron [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Updated VIF entry in instance network info cache for port 50bad736-fd34-4e71-a00b-6194697c2738. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1548.235821] env[67820]: DEBUG nova.network.neutron [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Updating instance_info_cache with network_info: [{"id": "50bad736-fd34-4e71-a00b-6194697c2738", "address": "fa:16:3e:cf:0e:34", "network": {"id": "6972e393-2331-469b-8ada-007c145171ed", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-586972927-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d2d1a68ef76415584b4404eea3a8363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac4015e0-e5e7-4b3f-8d8e-ef4501eea9aa", "external-id": "nsx-vlan-transportzone-132", "segmentation_id": 132, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50bad736-fd", "ovs_interfaceid": "50bad736-fd34-4e71-a00b-6194697c2738", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1548.245122] env[67820]: DEBUG oslo_concurrency.lockutils [req-2302e344-f884-47f6-bb1b-86d5948cc625 req-47f0d618-cb21-4fcd-8afd-b8af1f8897e3 service nova] Releasing lock "refresh_cache-0cda1de0-73dd-45dd-932b-75e59fb785cf" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1548.452331] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467428, 'name': CreateVM_Task, 'duration_secs': 0.300142} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1548.452493] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1548.453172] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1548.453340] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1548.453670] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1548.453920] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ceebbf07-8d0e-4a10-aae7-6abd2921fdc0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1548.458154] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for the task: (returnval){ [ 1548.458154] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]523968b8-94aa-acb1-84ae-5e47c8f6c990" [ 1548.458154] env[67820]: _type = "Task" [ 1548.458154] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1548.465325] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]523968b8-94aa-acb1-84ae-5e47c8f6c990, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1548.969148] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1548.969459] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1548.969627] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1549.622065] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.622065] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1552.617063] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.621486] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.621661] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1552.621785] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1552.643212] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643212] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643401] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643465] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643555] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643678] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643797] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.643916] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.644045] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.644175] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1552.644326] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1552.644865] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.645050] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1554.000820] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1555.621557] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1555.621833] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1555.621977] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1555.636030] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1555.636030] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1555.636030] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1555.636030] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1555.636668] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4b0f679c-75fe-46e6-8e6b-14d616981be5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.645176] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9741ff4f-98a2-48e3-a467-b7a54a07ca0f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.659736] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1a9c814-8932-441a-b55e-f38fa643d5d6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.666314] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-278339ee-b3e0-4db9-84f7-e462f3a1c193 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1555.695188] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180826MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1555.695369] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1555.695567] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1555.766780] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.766967] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767110] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767235] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767355] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767550] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767689] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767807] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.767922] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.768049] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1555.779201] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e304b657-1f29-46e5-9f52-8809f8b29606 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.789674] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.798858] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance abcea602-a4fc-4dea-9261-a0111db20f84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.808896] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0e0e1852-25a6-42dd-9d5a-08af14e6423a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.817327] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a84c5537-9ad1-44d6-b732-fda1156bff86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.827366] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1555.827586] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1555.827732] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1556.001431] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f8d0407-0ada-4200-b52d-68b9db2dbf7f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1556.009016] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6385f135-8b7b-46d3-b88e-2a71385523e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1556.038635] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52e7c40b-c15b-424f-b889-3a668b7fe42c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1556.045961] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-527535f4-32bc-4001-83cf-088777415cc8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1556.059303] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1556.067490] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1556.080839] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1556.080958] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1570.304851] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1570.305307] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1589.686580] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1589.686876] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1594.974726] env[67820]: WARNING oslo_vmware.rw_handles [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1594.974726] env[67820]: ERROR oslo_vmware.rw_handles [ 1594.975379] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] 
[instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1594.977067] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1594.977327] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Copying Virtual Disk [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/36a0c0bf-be28-4091-b55b-7f720a377da4/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1594.977609] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e0d6310e-eba0-4850-8ba7-8ec11590162b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1594.986041] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for the task: (returnval){ [ 1594.986041] env[67820]: value = "task-3467429" [ 1594.986041] env[67820]: _type = "Task" [ 1594.986041] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1594.993573] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Task: {'id': task-3467429, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1595.496536] env[67820]: DEBUG oslo_vmware.exceptions [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1595.496822] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1595.497409] env[67820]: ERROR nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1595.497409] env[67820]: Faults: ['InvalidArgument'] [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Traceback (most recent call last): [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] yield resources [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self.driver.spawn(context, instance, image_meta, [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self._fetch_image_if_missing(context, vi) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] image_cache(vi, tmp_image_ds_loc) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] vm_util.copy_virtual_disk( [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] session._wait_for_task(vmdk_copy_task) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return self.wait_for_task(task_ref) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return evt.wait() [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] result = hub.switch() [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return self.greenlet.switch() [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self.f(*self.args, **self.kw) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] raise exceptions.translate_fault(task_info.error) [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Faults: ['InvalidArgument'] [ 1595.497409] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] [ 1595.498402] env[67820]: INFO nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Terminating instance [ 1595.499358] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1595.499544] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1595.500156] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f 
tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1595.500340] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1595.500589] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b614eb01-1b0d-48e4-b35e-49faf2b304d9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.503180] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63b9ce71-c570-4ad0-a11e-5971e69b972e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.509800] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1595.509985] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-59544076-22f2-4635-b112-1d371d83d03d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.512149] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1595.512323] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1595.513399] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e8772684-7e75-4485-b073-f10425f1aa9c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.518168] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for the task: (returnval){ [ 1595.518168] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5296bb29-cf35-d5f7-fc9d-80a3228b42c2" [ 1595.518168] env[67820]: _type = "Task" [ 1595.518168] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1595.527884] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5296bb29-cf35-d5f7-fc9d-80a3228b42c2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1595.591927] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1595.592182] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1595.592391] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Deleting the datastore file [datastore1] 9d6e6061-056f-4d2d-9860-22f154edc9ab {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1595.592695] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-82e15f24-8cfa-451d-970b-2f647056dc95 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1595.599527] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for the task: (returnval){ [ 1595.599527] env[67820]: value = "task-3467431" [ 1595.599527] env[67820]: _type = "Task" [ 1595.599527] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1595.607094] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Task: {'id': task-3467431, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1596.029282] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1596.029633] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Creating directory with path [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1596.029841] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-21c228c8-9716-4819-9f9c-2789aec5226e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.043671] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Created directory with path [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1596.043842] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Fetch image to [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1596.044021] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1596.044912] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f247f379-e1a1-4547-bba5-223dde92e502 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.052169] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08f77f86-e396-4af7-8371-5161b8229dad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.061624] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74fb2fbd-19c9-41d0-a962-1d8127b75947 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.095273] env[67820]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-145e82ee-b8a2-4aeb-ba15-ba336d6a7207 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.104321] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87402b39-747a-44b3-a161-4661b42af969 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.110904] env[67820]: DEBUG oslo_vmware.api [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Task: {'id': task-3467431, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071728} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1596.111156] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1596.111342] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1596.111511] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1596.111783] env[67820]: INFO nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Took 0.61 seconds to destroy the instance on the hypervisor. 
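The records above follow oslo.vmware's standard task pattern: a *_Task method (CopyVirtualDisk_Task, DeleteDatastoreFile_Task) is invoked, the session polls the task until it reaches a terminal state, and a failed task's fault list is translated into a Python exception, which is how the InvalidArgument fault surfaced as VimFaultException in the traceback. A minimal, self-contained sketch of that polling loop; `get_task_info` and the field names are simplified stand-ins for the real oslo_vmware.api internals:

```python
import time


class VimFaultException(Exception):
    """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, interval=0.5):
    """Poll a vSphere-style task until it finishes.

    `get_task_info` is a hypothetical callable returning an object with
    `state` ('running', 'success' or 'error') and an `error` carrying the
    fault details; the real loop lives in oslo_vmware.api._poll_task and
    runs inside an eventlet looping call rather than sleeping inline.
    """
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # Mirrors `raise exceptions.translate_fault(task_info.error)`
            # from the tracebacks above.
            raise VimFaultException(info.error.faults, info.error.message)
        time.sleep(interval)
```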
[ 1596.113920] env[67820]: DEBUG nova.compute.claims [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1596.114108] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1596.114319] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1596.130274] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1596.187235] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1596.248123] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1596.248123] env[67820]: DEBUG oslo_vmware.rw_handles [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1596.450039] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c94f6df-d067-4938-9483-1e62512dff07 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.458486] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f287fa6-3522-452d-bc1e-b0cfc25a6675 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.488754] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f411538b-a772-4e31-a7e5-70fe13a0922c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.496626] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71c31ada-da4e-4e65-b3e9-9780e82149b4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1596.511871] env[67820]: DEBUG nova.compute.provider_tree [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1596.521491] env[67820]: DEBUG nova.scheduler.client.report [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1596.538815] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.424s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1596.539385] env[67820]: ERROR nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.539385] env[67820]: Faults: ['InvalidArgument'] [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Traceback (most recent call last): [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1596.539385] 
env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self.driver.spawn(context, instance, image_meta, [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self._fetch_image_if_missing(context, vi) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] image_cache(vi, tmp_image_ds_loc) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] vm_util.copy_virtual_disk( [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] session._wait_for_task(vmdk_copy_task) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return self.wait_for_task(task_ref) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return evt.wait() [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] result = hub.switch() [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] return self.greenlet.switch() [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] self.f(*self.args, **self.kw) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] raise exceptions.translate_fault(task_info.error) [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Faults: ['InvalidArgument'] [ 1596.539385] env[67820]: ERROR nova.compute.manager [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] [ 1596.540121] env[67820]: DEBUG nova.compute.utils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1596.541663] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Build of instance 9d6e6061-056f-4d2d-9860-22f154edc9ab was re-scheduled: A specified parameter was not correct: fileType [ 1596.541663] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1596.542112] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1596.542251] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1596.542542] env[67820]: DEBUG nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1596.542583] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1596.991471] env[67820]: DEBUG nova.network.neutron [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1597.004916] env[67820]: INFO nova.compute.manager [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Took 0.46 seconds to deallocate network for instance. [ 1597.100445] env[67820]: INFO nova.scheduler.client.report [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Deleted allocations for instance 9d6e6061-056f-4d2d-9860-22f154edc9ab [ 1597.126642] env[67820]: DEBUG oslo_concurrency.lockutils [None req-23764395-4d05-42b5-801c-0fba83cae11f tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 635.991s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.127863] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 440.321s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.128113] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Acquiring lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.128343] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.128521] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.131828] env[67820]: INFO nova.compute.manager [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Terminating instance [ 1597.133477] env[67820]: DEBUG nova.compute.manager [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1597.133676] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1597.133961] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c1dd6214-79b2-4636-919e-4fc4deaf503c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.139130] env[67820]: DEBUG nova.compute.manager [None req-c95813c5-5516-4f78-9c6b-a7b04cef4292 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: e304b657-1f29-46e5-9f52-8809f8b29606] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1597.145831] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62e4ef99-ec8d-47f7-841c-b72daaa2873a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.163760] env[67820]: DEBUG nova.compute.manager [None req-c95813c5-5516-4f78-9c6b-a7b04cef4292 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: e304b657-1f29-46e5-9f52-8809f8b29606] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1597.176593] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9d6e6061-056f-4d2d-9860-22f154edc9ab could not be found. 
[ 1597.176792] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1597.176975] env[67820]: INFO nova.compute.manager [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1597.177225] env[67820]: DEBUG oslo.service.loopingcall [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1597.177457] env[67820]: DEBUG nova.compute.manager [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1597.177554] env[67820]: DEBUG nova.network.neutron [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1597.193647] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c95813c5-5516-4f78-9c6b-a7b04cef4292 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "e304b657-1f29-46e5-9f52-8809f8b29606" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.185s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.201412] env[67820]: DEBUG nova.network.neutron [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1597.203465] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1597.209390] env[67820]: INFO nova.compute.manager [-] [instance: 9d6e6061-056f-4d2d-9860-22f154edc9ab] Took 0.03 seconds to deallocate network for instance.
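The "Waiting for function ... to return" line comes from oslo.service's looping-call machinery: the retried operation runs inside a looping call, the caller blocks on the returned event, and the wrapped function signals completion by raising LoopingCallDone. A sketch of that pattern using the fixed-interval flavor (the deallocation path logged above appears to use a backoff variant, and the real body calls Neutron with retries):

```python
from oslo_service import loopingcall


def _deallocate_network_with_retries():
    # Hypothetical body; the real function deallocates the instance's
    # ports in Neutron and retries on transient failures.
    raise loopingcall.LoopingCallDone(retvalue='deallocated')


timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_with_retries)
event = timer.start(interval=1)
print(event.wait())  # blocks until LoopingCallDone, then prints 'deallocated'
```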
[ 1597.261021] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.261021] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1597.261429] env[67820]: INFO nova.compute.claims [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1597.294288] env[67820]: DEBUG oslo_concurrency.lockutils [None req-d8211370-a2df-4d34-b751-dea911badaf7 tempest-InstanceActionsTestJSON-1936862876 tempest-InstanceActionsTestJSON-1936862876-project-member] Lock "9d6e6061-056f-4d2d-9860-22f154edc9ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.166s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.494523] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ab950d7-83f9-470b-aef9-892e419dbbb4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.502528] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57bfe9cf-48b6-430e-9e51-b7b17ca77220 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.533142] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfbd8c59-16c7-4778-9fd2-22d528370e91 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.541132] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0adb02fb-1a2f-4151-8e96-54185014e2e9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.554896] env[67820]: DEBUG nova.compute.provider_tree [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1597.563738] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0},
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1597.577490] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1597.577956] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1597.617532] env[67820]: DEBUG nova.compute.utils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1597.618841] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1597.619031] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1597.627511] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1597.698190] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1597.714195] env[67820]: DEBUG nova.policy [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df43615850404e60b571c2ab5296519c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e17152dd1ce04f3dbcb729e8315f0006', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1597.726063] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1597.726411] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1597.726498] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1597.726613] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1597.726760] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1597.726904] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1597.727117] 
env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1597.727278] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1597.727443] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1597.727607] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1597.727781] env[67820]: DEBUG nova.virt.hardware [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1597.728694] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a81376e0-eb8f-4aed-9ad7-7e7611bbd8a0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1597.737117] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-107dea00-a156-422e-a5b7-346c516921e0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1598.126201] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Successfully created port: 592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1598.811442] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Successfully updated port: 592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1598.831171] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1598.831334] env[67820]: DEBUG oslo_concurrency.lockutils 
[None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1598.831487] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1598.890235] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1599.181603] env[67820]: DEBUG nova.compute.manager [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Received event network-vif-plugged-592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1599.181859] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Acquiring lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1599.182041] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1599.182214] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1599.182357] env[67820]: DEBUG nova.compute.manager [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] No waiting events found dispatching network-vif-plugged-592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1599.182520] env[67820]: WARNING nova.compute.manager [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Received unexpected event network-vif-plugged-592aa22c-2331-419c-a04a-1bda7978248d for instance with vm_state building and task_state spawning.
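The network-vif-plugged and network-changed records are Neutron notifying Nova through the server external events API: each event is tagged with the Neutron port ID so the compute manager can match it against registered waiters (none were registered here, hence the WARNING about an unexpected event). A schematic of the payload such a notification carries, with the UUIDs taken from the records above:

```python
# Schematic request body for POST /v2.1/os-server-external-events,
# the call Neutron makes to deliver the event seen in the log.
event_payload = {
    "events": [{
        "name": "network-vif-plugged",
        "server_uuid": "99f872a5-2e7d-42b9-a94f-67153db8d0ad",  # instance
        "tag": "592aa22c-2331-419c-a04a-1bda7978248d",          # port ID
        "status": "completed",
    }]
}
```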
[ 1599.182743] env[67820]: DEBUG nova.compute.manager [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Received event network-changed-592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1599.182985] env[67820]: DEBUG nova.compute.manager [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Refreshing instance network info cache due to event network-changed-592aa22c-2331-419c-a04a-1bda7978248d. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1599.183205] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Acquiring lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1599.294516] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Updating instance_info_cache with network_info: [{"id": "592aa22c-2331-419c-a04a-1bda7978248d", "address": "fa:16:3e:08:87:5c", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592aa22c-23", "ovs_interfaceid": "592aa22c-2331-419c-a04a-1bda7978248d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.307514] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1599.307783] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance network_info: |[{"id": "592aa22c-2331-419c-a04a-1bda7978248d", "address": "fa:16:3e:08:87:5c", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": 
[{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592aa22c-23", "ovs_interfaceid": "592aa22c-2331-419c-a04a-1bda7978248d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1599.308077] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Acquired lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1599.308258] env[67820]: DEBUG nova.network.neutron [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Refreshing network info cache for port 592aa22c-2331-419c-a04a-1bda7978248d {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1599.309528] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:08:87:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '418ddd3d-5f64-407e-8e0c-c8b81639bee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '592aa22c-2331-419c-a04a-1bda7978248d', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1599.317074] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Creating folder: Project (e17152dd1ce04f3dbcb729e8315f0006). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.320013] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-00237172-5a1a-44a9-9866-02064e5b428b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.331058] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Created folder: Project (e17152dd1ce04f3dbcb729e8315f0006) in parent group-v692668. 
[ 1599.331058] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Creating folder: Instances. Parent ref: group-v692754. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1599.331221] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0606c258-46de-4fca-ad5b-40d02272ec8d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.339969] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Created folder: Instances in parent group-v692754. [ 1599.340317] env[67820]: DEBUG oslo.service.loopingcall [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1599.340501] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1599.340689] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-01891282-b503-4880-b207-00c5db660302 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1599.362061] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1599.362061] env[67820]: value = "task-3467434" [ 1599.362061] env[67820]: _type = "Task" [ 1599.362061] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1599.369120] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467434, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1599.575482] env[67820]: DEBUG nova.network.neutron [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Updated VIF entry in instance network info cache for port 592aa22c-2331-419c-a04a-1bda7978248d. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1599.575912] env[67820]: DEBUG nova.network.neutron [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Updating instance_info_cache with network_info: [{"id": "592aa22c-2331-419c-a04a-1bda7978248d", "address": "fa:16:3e:08:87:5c", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592aa22c-23", "ovs_interfaceid": "592aa22c-2331-419c-a04a-1bda7978248d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1599.585083] env[67820]: DEBUG oslo_concurrency.lockutils [req-df3ac898-4237-4dc5-b62e-561e33fc59b1 req-7a3d79c6-31a5-4b96-beff-7f666b599317 service nova] Releasing lock "refresh_cache-99f872a5-2e7d-42b9-a94f-67153db8d0ad" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1599.701871] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1599.871527] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467434, 'name': CreateVM_Task} progress is 25%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1600.372679] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467434, 'name': CreateVM_Task, 'duration_secs': 0.977454} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1600.373019] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1600.373508] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1600.373675] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1600.373989] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1600.374247] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-66387b90-226d-4ca0-8ed2-fe51a927ba6a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1600.378564] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 1600.378564] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5212192a-099a-6a93-f2ad-7bf668e1f618" [ 1600.378564] env[67820]: _type = "Task" [ 1600.378564] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1600.385750] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5212192a-099a-6a93-f2ad-7bf668e1f618, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1600.890929] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1600.891217] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1600.891430] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1605.076107] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1608.621355] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.622636] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.623037] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1612.617577] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1613.160752] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1613.160975] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1613.188261] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1613.188497] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1613.621099] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.621599] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1614.621859] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1614.621904] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1614.645068] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Skipping network cache update for 
instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.645330] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.645492] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.645621] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.646614] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.646814] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.646958] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.647101] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.647228] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.647349] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1614.647468] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1614.647972] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.621828] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.622112] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.633073] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1615.633299] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1615.633465] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1615.633622] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1615.634758] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd11c0d1-bc78-440d-8ead-a27133119d88 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.643614] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a74e046-31c3-4d7d-9abc-754b114f30d7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.657081] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c37c402-c126-4346-a47b-5b2913f151ab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.663303] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb6aec0-44de-4aab-b168-db6cebefb8b9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1615.692986] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180952MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1615.693157] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1615.693346] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1615.772795] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.772973] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773115] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773238] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773358] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773475] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773593] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773708] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.773822] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.774103] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1615.784965] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a84c5537-9ad1-44d6-b732-fda1156bff86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.794524] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.803733] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.813083] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.821846] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.830344] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1615.830558] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1615.830703] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1616.008110] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac7832f8-d577-4184-bce8-9b070000f580 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.015510] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b22fc0ca-a2eb-44dd-a52d-dba30cba29ee {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.044409] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66e9fda5-044b-4b52-a7c3-0bcd38a843dd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.051149] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97979df5-4731-40dd-929d-0d98f2198c89 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1616.064604] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1616.072504] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1616.086446] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1616.086446] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.393s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1618.085916] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1635.880205] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ea24370-27bd-47d7-aeca-5473b5a8b7a9 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "813af34e-49cc-40a9-a0d0-388a84fde493" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1635.880492] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ea24370-27bd-47d7-aeca-5473b5a8b7a9 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "813af34e-49cc-40a9-a0d0-388a84fde493" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.988093] env[67820]: WARNING oslo_vmware.rw_handles [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1644.988093] env[67820]: ERROR oslo_vmware.rw_handles [ 1644.988801] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1644.990451] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1644.990712] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Copying Virtual Disk [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/0412e1a0-ae61-4426-bced-65908b1580be/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1644.991025] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-338b318c-7006-464f-8d75-6b56be914c91 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.000935] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for the task: (returnval){ [ 1645.000935] env[67820]: value = "task-3467435" [ 1645.000935] env[67820]: _type = "Task" [ 1645.000935] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1645.009399] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Task: {'id': task-3467435, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1645.511685] env[67820]: DEBUG oslo_vmware.exceptions [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1645.511950] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1645.512530] env[67820]: ERROR nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1645.512530] env[67820]: Faults: ['InvalidArgument'] [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Traceback (most recent call last): [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] yield resources [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self.driver.spawn(context, instance, image_meta, [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self._fetch_image_if_missing(context, vi) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] image_cache(vi, tmp_image_ds_loc) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] vm_util.copy_virtual_disk( [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] session._wait_for_task(vmdk_copy_task) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return self.wait_for_task(task_ref) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return evt.wait() [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] result = hub.switch() [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return self.greenlet.switch() [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self.f(*self.args, **self.kw) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] raise exceptions.translate_fault(task_info.error) [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Faults: ['InvalidArgument'] [ 1645.512530] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] [ 1645.513957] env[67820]: INFO nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Terminating instance [ 1645.514502] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1645.514723] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1645.515017] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-0cdde94c-f64c-4bb2-8406-9803956f3464 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.517514] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1645.517699] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1645.518443] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3a6e49b-3a43-44d4-9e27-be62579eb830 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.525114] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1645.525337] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c1007d81-760a-4e83-9171-3fb0c49c37f2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.528714] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1645.528890] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1645.529639] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a28f75f-608c-44d9-8354-5aa0327e7708 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.534704] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for the task: (returnval){ [ 1645.534704] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5247a629-2f01-f4f8-3aa0-8b711e5cbef8" [ 1645.534704] env[67820]: _type = "Task" [ 1645.534704] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1645.548304] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5247a629-2f01-f4f8-3aa0-8b711e5cbef8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1645.606430] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1645.606667] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1645.606822] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Deleting the datastore file [datastore1] 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1645.607130] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a14564b3-16d0-4ca1-91b9-c9456321263f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1645.613461] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for the task: (returnval){ [ 1645.613461] env[67820]: value = "task-3467437" [ 1645.613461] env[67820]: _type = "Task" [ 1645.613461] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1645.621422] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Task: {'id': task-3467437, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1646.046106] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1646.046456] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Creating directory with path [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1646.046554] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf8e6c85-bb56-441a-a88c-da07ed607054 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.058105] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Created directory with path [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1646.058325] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Fetch image to [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1646.058496] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1646.059302] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30f86ccf-d6fb-4d6e-ada5-9b7b5faed364 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.066366] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44345b0a-5832-4c9a-bb41-d50d095b5da8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.077200] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce59ac3-c7c3-4756-b4b7-a3efebc97749 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.109073] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9244cb50-59d8-4d92-9635-4d02249d8f9d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.118376] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-489568ff-1064-42d3-93a4-1ace6d697850 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.125113] env[67820]: DEBUG oslo_vmware.api [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Task: {'id': task-3467437, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082991} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1646.125351] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1646.125526] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1646.125691] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1646.125862] env[67820]: INFO nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Took 0.61 seconds to destroy the instance on the hypervisor. 
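[editor's annotation — not part of the captured log] The wait_for_task / _poll_task pairs above show the oslo.vmware task pattern: an API call such as FileManager.DeleteDatastoreFile_Task returns a task handle immediately, and a looping call then polls the task until it reaches a terminal state ("progress is 0%" ... "completed successfully"). A minimal, library-independent sketch of that loop, with a hypothetical fetch_task_info callable standing in for the real vSphere query:

    import time

    class TaskFailed(Exception):
        """Raised when the remote task ends in an error state."""

    def wait_for_task(fetch_task_info, poll_interval=0.5):
        """Poll a remote task until it succeeds or fails.

        fetch_task_info() is assumed to return an object with
        .state in {'queued', 'running', 'success', 'error'},
        .progress (0-100) and .error (a message or None).
        """
        while True:
            info = fetch_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # oslo.vmware likewise translates the task's fault into an
                # exception (see the VimFaultException traceback below)
                raise TaskFailed(info.error)
            # entries like "progress is 0%" are emitted from this step
            time.sleep(poll_interval)

A successful poll ends with a duration_secs field (0.082991 s for task-3467437 above); a failed one surfaces as the translated fault seen further down in this log.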
[ 1646.128213] env[67820]: DEBUG nova.compute.claims [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1646.128391] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1646.128599] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1646.141448] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1646.193853] env[67820]: DEBUG oslo_vmware.rw_handles [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1646.255193] env[67820]: DEBUG oslo_vmware.rw_handles [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1646.255403] env[67820]: DEBUG oslo_vmware.rw_handles [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1646.444807] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05a33df4-a55d-4a43-924b-ece285d9b176 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.452862] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c71025d3-aa2c-4617-9b58-da93fb82f00c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.484249] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a10c804d-9856-43ca-8e67-1fdcc57b31a6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.491591] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d55a5f-d675-4428-8f06-ad6ca04d0e3a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1646.504438] env[67820]: DEBUG nova.compute.provider_tree [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1646.515692] env[67820]: DEBUG nova.scheduler.client.report [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1646.529272] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.401s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.529810] env[67820]: ERROR nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.529810] env[67820]: Faults: ['InvalidArgument'] [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Traceback (most recent call last): [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self.driver.spawn(context, instance, image_meta, [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self._fetch_image_if_missing(context, vi) [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] image_cache(vi, tmp_image_ds_loc) [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] vm_util.copy_virtual_disk( [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] session._wait_for_task(vmdk_copy_task) [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return self.wait_for_task(task_ref) [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return evt.wait() [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] result = hub.switch() [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] return self.greenlet.switch() [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] self.f(*self.args, **self.kw) [ 1646.529810] 
env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] raise exceptions.translate_fault(task_info.error) [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Faults: ['InvalidArgument'] [ 1646.529810] env[67820]: ERROR nova.compute.manager [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] [ 1646.530691] env[67820]: DEBUG nova.compute.utils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1646.532028] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Build of instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 was re-scheduled: A specified parameter was not correct: fileType [ 1646.532028] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1646.532393] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1646.532565] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1646.532732] env[67820]: DEBUG nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1646.532960] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1646.947405] env[67820]: DEBUG nova.network.neutron [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1646.961514] env[67820]: INFO nova.compute.manager [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Took 0.43 seconds to deallocate network for instance. [ 1647.055199] env[67820]: INFO nova.scheduler.client.report [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Deleted allocations for instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 [ 1647.075204] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ce6f699c-3e6a-4b0e-8ddd-4a2acdf27776 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 628.315s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.076827] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 432.092s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.076827] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Acquiring lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1647.076827] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 
tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.077135] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.078948] env[67820]: INFO nova.compute.manager [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Terminating instance [ 1647.081702] env[67820]: DEBUG nova.compute.manager [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1647.081787] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1647.082348] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3a817167-8ab3-4e7c-b713-4b7be6944462 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.092646] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cffd739-ecc9-4ef3-98f5-5bb558fcc823 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.103600] env[67820]: DEBUG nova.compute.manager [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: abcea602-a4fc-4dea-9261-a0111db20f84] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1647.124224] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 could not be found. 
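[editor's annotation — not part of the captured log] The oslo_concurrency.lockutils entries around instance 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3 record, for each named lock, how long the caller waited for it and how long it was held (the build lock was held 628.315 s; the terminate path waited 432.092 s for it; the "-events" lock came and went in 0.000 s). A small sketch of the same acquire/measure/release pattern, assuming plain threading rather than oslo.concurrency:

    import threading
    import time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)  # one lock per name, created on demand

    @contextmanager
    def named_lock(name):
        start = time.monotonic()
        with _locks[name]:
            waited = time.monotonic() - start
            print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
            held_start = time.monotonic()
            try:
                yield
            finally:
                held = time.monotonic() - held_start
                print(f'Lock "{name}" "released" :: held {held:.3f}s')

Usage would be: with named_lock("compute_resources"): ... — the waited/held figures in the log come from this kind of instrumentation wrapped around each critical section.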
[ 1647.124430] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1647.124606] env[67820]: INFO nova.compute.manager [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1647.124848] env[67820]: DEBUG oslo.service.loopingcall [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1647.125119] env[67820]: DEBUG nova.compute.manager [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1647.125221] env[67820]: DEBUG nova.network.neutron [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1647.130307] env[67820]: DEBUG nova.compute.manager [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: abcea602-a4fc-4dea-9261-a0111db20f84] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1647.152616] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "abcea602-a4fc-4dea-9261-a0111db20f84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.526s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.155206] env[67820]: DEBUG nova.network.neutron [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1647.162463] env[67820]: INFO nova.compute.manager [-] [instance: 3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3] Took 0.04 seconds to deallocate network for instance. [ 1647.166633] env[67820]: DEBUG nova.compute.manager [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 0e0e1852-25a6-42dd-9d5a-08af14e6423a] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1647.187423] env[67820]: DEBUG nova.compute.manager [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 0e0e1852-25a6-42dd-9d5a-08af14e6423a] Instance disappeared before build. 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1647.208378] env[67820]: DEBUG oslo_concurrency.lockutils [None req-37efd528-0eb9-44be-b4d5-fdc6b10b057e tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "0e0e1852-25a6-42dd-9d5a-08af14e6423a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.554s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.220658] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1647.270984] env[67820]: DEBUG oslo_concurrency.lockutils [None req-347a37e5-c7fd-4119-aafe-9d907d106ab2 tempest-InstanceActionsNegativeTestJSON-1929999534 tempest-InstanceActionsNegativeTestJSON-1929999534-project-member] Lock "3ecf2cab-b4fd-43cb-87f2-c5b89aff7da3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.275247] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1647.275476] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.276904] env[67820]: INFO nova.compute.claims [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1647.509436] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdbf55f5-ac48-495b-8323-594e61522479 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.517063] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd337c67-4ccd-4aed-862d-f54e811ba9fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.547187] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24f94a4-d308-431b-b4ad-c1fa491d27a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.554243] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-255543fa-95c0-4af8-8d78-6373c7d3523b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.567333] env[67820]: DEBUG nova.compute.provider_tree [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1647.577043] env[67820]: DEBUG nova.scheduler.client.report [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1647.590790] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.591118] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1647.626641] env[67820]: DEBUG nova.compute.utils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1647.628193] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1647.628374] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1647.637896] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Start building block device mappings for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1647.669284] env[67820]: INFO nova.virt.block_device [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Booting with volume 10770546-90a9-41c9-87c1-ddf570ebdba4 at /dev/sda [ 1647.689915] env[67820]: DEBUG nova.policy [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae89577a55fc477da31c234cdbbb6e56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a16ea7f288a4f31bf44a6437aabf3ef', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1647.716403] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-42a23192-7733-427d-87be-2af468526486 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.725848] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddbbb883-8824-499d-b591-0491fb5c6b6e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.753116] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-02d2193a-5b0d-4e88-8c05-3d5bd0de7a93 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.760472] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3229c2e6-aa90-45e1-98c6-72a8547b0615 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.788755] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-565e2af3-5b36-4ffc-add3-b9475a6d4bbc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.795505] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cd1a420-0409-431e-99f7-fd983763f384 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.810448] env[67820]: DEBUG nova.virt.block_device [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updating existing volume attachment record: 8cc98228-ff4a-4b9d-a419-9442f0f19a6c {{(pid=67820) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1648.020394] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Successfully created port: 92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 1648.031766] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1648.032286] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1648.032534] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1648.032653] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1648.032826] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1648.032986] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1648.033145] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1648.033342] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1648.033488] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 
tempest-ServerActionsV293TestJSON-1449060620-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1648.033642] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1648.033794] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1648.033955] env[67820]: DEBUG nova.virt.hardware [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1648.035119] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3343ca8b-a03f-40be-ad40-d64baee912d7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.044218] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-638851d7-3ece-4e38-b817-926990523938 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.658839] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Successfully updated port: 92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1648.671303] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1648.671424] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquired lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1648.671576] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1648.721492] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: 
a84c5537-9ad1-44d6-b732-fda1156bff86] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1648.914089] env[67820]: DEBUG nova.network.neutron [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updating instance_info_cache with network_info: [{"id": "92a38cfe-36fa-4106-b732-63c92224a86f", "address": "fa:16:3e:7e:d8:39", "network": {"id": "333340b1-cc1d-474a-8269-2a089f8f3296", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1756869440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a16ea7f288a4f31bf44a6437aabf3ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1e7173e-4163-4212-9339-aea3eddd359e", "external-id": "nsx-vlan-transportzone-525", "segmentation_id": 525, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92a38cfe-36", "ovs_interfaceid": "92a38cfe-36fa-4106-b732-63c92224a86f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1648.929117] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Releasing lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1648.929410] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance network_info: |[{"id": "92a38cfe-36fa-4106-b732-63c92224a86f", "address": "fa:16:3e:7e:d8:39", "network": {"id": "333340b1-cc1d-474a-8269-2a089f8f3296", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1756869440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a16ea7f288a4f31bf44a6437aabf3ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1e7173e-4163-4212-9339-aea3eddd359e", "external-id": "nsx-vlan-transportzone-525", "segmentation_id": 525, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92a38cfe-36", "ovs_interfaceid": "92a38cfe-36fa-4106-b732-63c92224a86f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1648.929794] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7e:d8:39', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c1e7173e-4163-4212-9339-aea3eddd359e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '92a38cfe-36fa-4106-b732-63c92224a86f', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1648.937141] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Creating folder: Project (3a16ea7f288a4f31bf44a6437aabf3ef). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1648.937655] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-19fb5b44-aeff-45f5-9da1-333a678ff796 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.951194] env[67820]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1648.951345] env[67820]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67820) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1648.951623] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Folder already exists: Project (3a16ea7f288a4f31bf44a6437aabf3ef). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1648.951806] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Creating folder: Instances. Parent ref: group-v692746. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1648.952043] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1ea8dc2f-dd61-451a-969c-34d5c0bcb963 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.960392] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Created folder: Instances in parent group-v692746. [ 1648.960597] env[67820]: DEBUG oslo.service.loopingcall [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1648.960769] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1648.960948] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d1e60237-347f-43bc-a5cc-2a00fc2aec16 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1648.979187] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1648.979187] env[67820]: value = "task-3467440" [ 1648.979187] env[67820]: _type = "Task" [ 1648.979187] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1648.986360] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467440, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1649.021446] env[67820]: DEBUG nova.compute.manager [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Received event network-vif-plugged-92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1649.021675] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Acquiring lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1649.021889] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1649.022066] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1649.022234] env[67820]: DEBUG nova.compute.manager [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] No waiting events found dispatching network-vif-plugged-92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1649.022391] env[67820]: WARNING nova.compute.manager [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Received unexpected event network-vif-plugged-92a38cfe-36fa-4106-b732-63c92224a86f for instance with vm_state building and task_state spawning. 
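[editor's annotation — not part of the captured log] The "Updating instance_info_cache with network_info" entries above serialize each VIF as JSON, so the port ID, MAC address, and fixed IP can be recovered directly from the logged blob. A trimmed excerpt of the entry for port 92a38cfe-36fa-4106-b732-63c92224a86f, parsed with nothing but the standard library:

    import json

    excerpt = '''[{"id": "92a38cfe-36fa-4106-b732-63c92224a86f",
        "address": "fa:16:3e:7e:d8:39",
        "network": {"subnets": [{"cidr": "192.168.128.0/28",
            "ips": [{"address": "192.168.128.7", "type": "fixed"}]}]}}]'''

    for vif in json.loads(excerpt):
        ip = vif["network"]["subnets"][0]["ips"][0]["address"]
        print(vif["id"], vif["address"], ip)

This prints the port ID, MAC fa:16:3e:7e:d8:39, and fixed IP 192.168.128.7, matching the cache entry logged for instance a84c5537-9ad1-44d6-b732-fda1156bff86.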
[ 1649.022543] env[67820]: DEBUG nova.compute.manager [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Received event network-changed-92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1649.022690] env[67820]: DEBUG nova.compute.manager [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Refreshing instance network info cache due to event network-changed-92a38cfe-36fa-4106-b732-63c92224a86f. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1649.022866] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Acquiring lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1649.023036] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Acquired lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1649.023225] env[67820]: DEBUG nova.network.neutron [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Refreshing network info cache for port 92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1649.363862] env[67820]: DEBUG nova.network.neutron [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updated VIF entry in instance network info cache for port 92a38cfe-36fa-4106-b732-63c92224a86f. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1649.364285] env[67820]: DEBUG nova.network.neutron [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updating instance_info_cache with network_info: [{"id": "92a38cfe-36fa-4106-b732-63c92224a86f", "address": "fa:16:3e:7e:d8:39", "network": {"id": "333340b1-cc1d-474a-8269-2a089f8f3296", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1756869440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a16ea7f288a4f31bf44a6437aabf3ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1e7173e-4163-4212-9339-aea3eddd359e", "external-id": "nsx-vlan-transportzone-525", "segmentation_id": 525, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92a38cfe-36", "ovs_interfaceid": "92a38cfe-36fa-4106-b732-63c92224a86f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1649.373870] env[67820]: DEBUG oslo_concurrency.lockutils [req-dd2e6ce6-b193-4cdd-a836-68deb267661f req-d901081d-1c6e-45ab-b3cf-990272298085 service nova] Releasing lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1649.489068] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467440, 'name': CreateVM_Task, 'duration_secs': 0.285602} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1649.489242] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1649.489852] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/sda', 'disk_bus': None, 'guest_format': None, 'delete_on_termination': True, 'attachment_id': '8cc98228-ff4a-4b9d-a419-9442f0f19a6c', 'boot_index': 0, 'device_type': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692749', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'name': 'volume-10770546-90a9-41c9-87c1-ddf570ebdba4', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a84c5537-9ad1-44d6-b732-fda1156bff86', 'attached_at': '', 'detached_at': '', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'serial': '10770546-90a9-41c9-87c1-ddf570ebdba4'}, 'volume_type': None}], 'swap': None} {{(pid=67820) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}}
[ 1649.490167] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Root volume attach. Driver type: vmdk {{(pid=67820) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}}
[ 1649.490928] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96ad9451-e454-460c-a78c-b39e343ecd27 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.499924] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4383f25-5619-45ad-a686-922126fd1468 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.505617] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f09d20c-17a9-4198-bd6b-ce5bacdd9bbf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.512550] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-0173c3d5-0ac6-4aef-b9a4-3b6ed4a93c63 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1649.518637] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1649.518637] env[67820]: value = "task-3467441"
[ 1649.518637] env[67820]: _type = "Task"
[ 1649.518637] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1649.531765] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467441, 'name': RelocateVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1650.028061] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467441, 'name': RelocateVM_Task, 'duration_secs': 0.345211} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1650.028372] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Volume attach. Driver type: vmdk {{(pid=67820) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}}
[ 1650.028573] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692749', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'name': 'volume-10770546-90a9-41c9-87c1-ddf570ebdba4', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a84c5537-9ad1-44d6-b732-fda1156bff86', 'attached_at': '', 'detached_at': '', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'serial': '10770546-90a9-41c9-87c1-ddf570ebdba4'} {{(pid=67820) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}}
[ 1650.029322] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-861a3c22-307b-4b16-aa4d-b7a695626e9d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1650.045292] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11f9ee90-1e2d-4226-b33a-3ebbe61b5695 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1650.067418] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Reconfiguring VM instance instance-0000004d to attach disk [datastore1] volume-10770546-90a9-41c9-87c1-ddf570ebdba4/volume-10770546-90a9-41c9-87c1-ddf570ebdba4.vmdk or device None with type thin {{(pid=67820) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}}
[ 1650.067661] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-8f05ef05-5a50-4045-a688-3b4adc80931b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1650.086920] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1650.086920] env[67820]: value = "task-3467442"
[ 1650.086920] env[67820]: _type = "Task"
[ 1650.086920] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1650.094252] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467442, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1650.596563] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467442, 'name': ReconfigVM_Task, 'duration_secs': 0.233843} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1650.596845] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Reconfigured VM instance instance-0000004d to attach disk [datastore1] volume-10770546-90a9-41c9-87c1-ddf570ebdba4/volume-10770546-90a9-41c9-87c1-ddf570ebdba4.vmdk or device None with type thin {{(pid=67820) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}}
[ 1650.601713] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f1946251-8173-44fd-b4c1-3642d0b23a44 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1650.616973] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1650.616973] env[67820]: value = "task-3467443"
[ 1650.616973] env[67820]: _type = "Task"
[ 1650.616973] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1650.624913] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467443, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1651.126674] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467443, 'name': ReconfigVM_Task, 'duration_secs': 0.112683} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1651.127015] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692749', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'name': 'volume-10770546-90a9-41c9-87c1-ddf570ebdba4', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a84c5537-9ad1-44d6-b732-fda1156bff86', 'attached_at': '', 'detached_at': '', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'serial': '10770546-90a9-41c9-87c1-ddf570ebdba4'} {{(pid=67820) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}}
[ 1651.127625] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-e024187c-e6ac-4c4d-9a62-48388da2e0e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1651.133950] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1651.133950] env[67820]: value = "task-3467444"
[ 1651.133950] env[67820]: _type = "Task"
[ 1651.133950] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1651.141150] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467444, 'name': Rename_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1651.644628] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467444, 'name': Rename_Task, 'duration_secs': 0.119408} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1651.644898] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Powering on the VM {{(pid=67820) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}}
[ 1651.645200] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-19afa837-9f3e-489b-a515-00d454059e46 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1651.651713] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1651.651713] env[67820]: value = "task-3467445"
[ 1651.651713] env[67820]: _type = "Task"
[ 1651.651713] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1651.660156] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467445, 'name': PowerOnVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1652.162035] env[67820]: DEBUG oslo_vmware.api [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467445, 'name': PowerOnVM_Task, 'duration_secs': 0.420298} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1652.162035] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Powered on the VM {{(pid=67820) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}}
[ 1652.162035] env[67820]: INFO nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Took 4.13 seconds to spawn the instance on the hypervisor.
[ 1652.162429] env[67820]: DEBUG nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Checking state {{(pid=67820) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
[ 1652.162810] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e42d88a-3332-4d51-869b-db4fe1072bd5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.214269] env[67820]: INFO nova.compute.manager [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Took 4.95 seconds to build instance.
[ 1652.228423] env[67820]: DEBUG oslo_concurrency.lockutils [None req-708554b5-c922-4121-ac68-2c34d76ef674 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 166.537s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1652.242737] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1652.291726] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1652.292044] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1652.293945] env[67820]: INFO nova.compute.claims [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1652.514896] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-401174c1-cae5-4cf8-a745-c74a49ef034a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.523088] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33fbe0e-2e03-454f-9b24-1f2fda4083ea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.553432] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd93e0d-a560-4ddc-ad21-2ecfae0f7f31 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.560814] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6294b3c-e000-46c9-907c-7306c08b415e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.575147] env[67820]: DEBUG nova.compute.provider_tree [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1652.583676] env[67820]: DEBUG nova.scheduler.client.report [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1652.597615] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1652.598112] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1652.631807] env[67820]: DEBUG nova.compute.utils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1652.633615] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1652.633840] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1652.643982] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1652.710187] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1652.736793] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1652.737112] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1652.737258] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1652.737463] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1652.737612] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1652.737758] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1652.737968] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1652.738185] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1652.738336] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1652.738509] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1652.738695] env[67820]: DEBUG nova.virt.hardware [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1652.739571] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aa62a3c-1bf6-41f9-b3da-9b5598f09369 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1652.743923] env[67820]: DEBUG nova.policy [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1652.751303] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2081e0-9875-4474-8fc0-d4654519acc7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1653.067910] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Successfully created port: c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1653.952114] env[67820]: DEBUG nova.compute.manager [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Received event network-vif-plugged-c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1653.952336] env[67820]: DEBUG oslo_concurrency.lockutils [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] Acquiring lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1653.952507] env[67820]: DEBUG oslo_concurrency.lockutils [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1653.952660] env[67820]: DEBUG oslo_concurrency.lockutils [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1653.953139] env[67820]: DEBUG nova.compute.manager [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] No waiting events found dispatching network-vif-plugged-c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1653.953139] env[67820]: WARNING nova.compute.manager [req-351c464f-9cbe-4175-992f-8b42c1e1cfef req-34322ba5-51db-4e1d-a465-9800c0dd3b70 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Received unexpected event network-vif-plugged-c07e653d-b10e-4e9c-8159-943cd3461fc8 for instance with vm_state building and task_state spawning.
[ 1653.957100] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Successfully updated port: c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1653.967475] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1653.967631] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1653.967780] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1654.026722] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1654.229257] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Updating instance_info_cache with network_info: [{"id": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "address": "fa:16:3e:30:82:b9", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc07e653d-b1", "ovs_interfaceid": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1654.246360] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1654.246676] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance network_info: |[{"id": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "address": "fa:16:3e:30:82:b9", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc07e653d-b1", "ovs_interfaceid": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1654.247081] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:30:82:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c07e653d-b10e-4e9c-8159-943cd3461fc8', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1654.255231] env[67820]: DEBUG oslo.service.loopingcall [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1654.255791] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1654.256105] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-50aca571-a977-40be-ad1a-63794d19c7c6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1654.277896] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1654.277896] env[67820]: value = "task-3467446"
[ 1654.277896] env[67820]: _type = "Task"
[ 1654.277896] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1654.288299] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467446, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1654.792026] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467446, 'name': CreateVM_Task, 'duration_secs': 0.273496} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1654.792026] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1654.792026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1654.792026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1654.792026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1654.792026] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4220d907-8d72-483e-8880-54024ea3a629 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1654.794929] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){
[ 1654.794929] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e75638-f025-96d4-c376-f26392061e94"
[ 1654.794929] env[67820]: _type = "Task"
[ 1654.794929] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1654.803073] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52e75638-f025-96d4-c376-f26392061e94, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1654.879938] env[67820]: DEBUG nova.compute.manager [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Received event network-changed-92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1654.880210] env[67820]: DEBUG nova.compute.manager [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Refreshing instance network info cache due to event network-changed-92a38cfe-36fa-4106-b732-63c92224a86f. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1654.880407] env[67820]: DEBUG oslo_concurrency.lockutils [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] Acquiring lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1654.880595] env[67820]: DEBUG oslo_concurrency.lockutils [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] Acquired lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1654.880709] env[67820]: DEBUG nova.network.neutron [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Refreshing network info cache for port 92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1655.311164] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1655.311477] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1655.311734] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1655.446466] env[67820]: DEBUG nova.network.neutron [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updated VIF entry in instance network info cache for port 92a38cfe-36fa-4106-b732-63c92224a86f. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1655.446679] env[67820]: DEBUG nova.network.neutron [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updating instance_info_cache with network_info: [{"id": "92a38cfe-36fa-4106-b732-63c92224a86f", "address": "fa:16:3e:7e:d8:39", "network": {"id": "333340b1-cc1d-474a-8269-2a089f8f3296", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1756869440-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.147", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a16ea7f288a4f31bf44a6437aabf3ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1e7173e-4163-4212-9339-aea3eddd359e", "external-id": "nsx-vlan-transportzone-525", "segmentation_id": 525, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92a38cfe-36", "ovs_interfaceid": "92a38cfe-36fa-4106-b732-63c92224a86f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1655.458538] env[67820]: DEBUG oslo_concurrency.lockutils [req-22d2fa63-4de8-4f66-9e4e-6e16b16e91bf req-30f3e2ce-fd82-4405-a791-0dc8361ee56d service nova] Releasing lock "refresh_cache-a84c5537-9ad1-44d6-b732-fda1156bff86" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1655.984380] env[67820]: DEBUG nova.compute.manager [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Received event network-changed-c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1655.984666] env[67820]: DEBUG nova.compute.manager [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Refreshing instance network info cache due to event network-changed-c07e653d-b10e-4e9c-8159-943cd3461fc8. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1655.984962] env[67820]: DEBUG oslo_concurrency.lockutils [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] Acquiring lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1655.985335] env[67820]: DEBUG oslo_concurrency.lockutils [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] Acquired lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1655.985575] env[67820]: DEBUG nova.network.neutron [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Refreshing network info cache for port c07e653d-b10e-4e9c-8159-943cd3461fc8 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1656.308958] env[67820]: DEBUG nova.network.neutron [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Updated VIF entry in instance network info cache for port c07e653d-b10e-4e9c-8159-943cd3461fc8. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1656.309368] env[67820]: DEBUG nova.network.neutron [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Updating instance_info_cache with network_info: [{"id": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "address": "fa:16:3e:30:82:b9", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc07e653d-b1", "ovs_interfaceid": "c07e653d-b10e-4e9c-8159-943cd3461fc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1656.322668] env[67820]: DEBUG oslo_concurrency.lockutils [req-a427a0c6-09d7-4e86-9f2b-41e4ad989ad4 req-3cc0ae70-0b7f-4d54-9d6a-6610d7df87d3 service nova] Releasing lock "refresh_cache-276123c5-3edc-4e33-9b13-baae0fc9de9f" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1660.253863] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1660.254183] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1670.622887] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1670.806474] env[67820]: INFO nova.compute.manager [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Rebuilding instance
[ 1670.847515] env[67820]: DEBUG nova.compute.manager [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Checking state {{(pid=67820) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
[ 1670.848401] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0722888c-5d0c-40b7-8ac5-f1b7da0c1e73 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1670.886459] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Powering off the VM {{(pid=67820) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}}
[ 1670.886977] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-d1448d7e-1095-4acb-8ed6-93f8e7ed3838 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1670.894107] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1670.894107] env[67820]: value = "task-3467447"
[ 1670.894107] env[67820]: _type = "Task"
[ 1670.894107] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1670.903600] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467447, 'name': PowerOffVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1671.404306] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467447, 'name': PowerOffVM_Task, 'duration_secs': 0.171792} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1671.404509] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Powered off the VM {{(pid=67820) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}}
[ 1671.405217] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Powering off the VM {{(pid=67820) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}}
[ 1671.405461] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-b54eaff2-d5b8-4958-b154-95a7e550cf69 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1671.411216] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1671.411216] env[67820]: value = "task-3467448"
[ 1671.411216] env[67820]: _type = "Task"
[ 1671.411216] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1671.418369] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467448, 'name': PowerOffVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1671.921649] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] VM already powered off {{(pid=67820) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}}
[ 1671.921906] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Volume detach. Driver type: vmdk {{(pid=67820) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}}
[ 1671.922072] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692749', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'name': 'volume-10770546-90a9-41c9-87c1-ddf570ebdba4', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a84c5537-9ad1-44d6-b732-fda1156bff86', 'attached_at': '', 'detached_at': '', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'serial': '10770546-90a9-41c9-87c1-ddf570ebdba4'} {{(pid=67820) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}}
[ 1671.922840] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9399728-f5fe-445c-8354-da75b1c9314a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1671.940739] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36ca4e1c-5ec9-44b8-8de2-af136fa16295 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1671.946929] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aab4cfd-784a-4488-9e6c-0c023ff86fa9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1671.963739] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dd32fea-5ae1-47e6-bef1-945c854879f5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1671.977607] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] The volume has not been displaced from its original location: [datastore1] volume-10770546-90a9-41c9-87c1-ddf570ebdba4/volume-10770546-90a9-41c9-87c1-ddf570ebdba4.vmdk. No consolidation needed. {{(pid=67820) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}}
[ 1671.982803] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Reconfiguring VM instance instance-0000004d to detach disk 2000 {{(pid=67820) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}}
[ 1671.983110] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-cc9c9c44-7278-42c2-8f12-6dc0deb0e01e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1672.000018] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){
[ 1672.000018] env[67820]: value = "task-3467449"
[ 1672.000018] env[67820]: _type = "Task"
[ 1672.000018] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1672.007191] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467449, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1672.509351] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467449, 'name': ReconfigVM_Task} progress is 99%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1672.621118] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1672.621310] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1673.010522] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467449, 'name': ReconfigVM_Task, 'duration_secs': 0.512208} completed successfully.
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1673.010823] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Reconfigured VM instance instance-0000004d to detach disk 2000 {{(pid=67820) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1673.015844] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-604ea7c2-bd82-476b-87df-e016936bfbe6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.030237] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){ [ 1673.030237] env[67820]: value = "task-3467450" [ 1673.030237] env[67820]: _type = "Task" [ 1673.030237] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1673.039320] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467450, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1673.539893] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467450, 'name': ReconfigVM_Task, 'duration_secs': 0.100047} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1673.540265] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692749', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'name': 'volume-10770546-90a9-41c9-87c1-ddf570ebdba4', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a84c5537-9ad1-44d6-b732-fda1156bff86', 'attached_at': '', 'detached_at': '', 'volume_id': '10770546-90a9-41c9-87c1-ddf570ebdba4', 'serial': '10770546-90a9-41c9-87c1-ddf570ebdba4'} {{(pid=67820) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1673.540612] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1673.541407] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3edf539f-aab8-4c11-9ff2-7614ab2d5ed5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.547600] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1673.547881] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d8082bd0-93f7-4b8d-96f9-c297c314f442 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.603514] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1673.603828] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1673.604074] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Deleting the datastore file [datastore1] a84c5537-9ad1-44d6-b732-fda1156bff86 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1673.604371] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8f07104b-1083-4eae-a45c-6c55d56ba728 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.609873] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for the task: (returnval){ [ 1673.609873] env[67820]: value = "task-3467452" [ 1673.609873] env[67820]: _type = "Task" [ 1673.609873] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1673.617044] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467452, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1674.119674] env[67820]: DEBUG oslo_vmware.api [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Task: {'id': task-3467452, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073219} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1674.119952] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1674.120248] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1674.120427] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1674.172535] env[67820]: DEBUG nova.virt.vmwareapi.volumeops [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Volume detach. 
Driver type: vmdk {{(pid=67820) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1674.172866] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-efb6ffaf-a356-4173-ae78-4e0d2bdb6316 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.181586] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcefa50e-fe02-4d9d-84ea-a90eaad59c6f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.210806] env[67820]: ERROR nova.compute.manager [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Failed to detach volume 10770546-90a9-41c9-87c1-ddf570ebdba4 from /dev/sda: nova.exception.InstanceNotFound: Instance a84c5537-9ad1-44d6-b732-fda1156bff86 could not be found. [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Traceback (most recent call last): [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 4116, in _do_rebuild_instance [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self.driver.rebuild(**kwargs) [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise NotImplementedError() [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] NotImplementedError [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] During handling of the above exception, another exception occurred: [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Traceback (most recent call last): [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3539, in _detach_root_volume [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self.driver.detach_volume(context, old_connection_info, [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] return self._volumeops.detach_volume(connection_info, instance) [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._detach_volume_vmdk(connection_info, instance) [ 1674.210806] env[67820]: ERROR nova.compute.manager 
[instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] stable_ref.fetch_moref(session) [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] nova.exception.InstanceNotFound: Instance a84c5537-9ad1-44d6-b732-fda1156bff86 could not be found. [ 1674.210806] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.340337] env[67820]: DEBUG nova.compute.utils [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Build of instance a84c5537-9ad1-44d6-b732-fda1156bff86 aborted: Failed to rebuild volume backed instance. {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1674.342876] env[67820]: ERROR nova.compute.manager [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance a84c5537-9ad1-44d6-b732-fda1156bff86 aborted: Failed to rebuild volume backed instance. 
[ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Traceback (most recent call last): [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 4116, in _do_rebuild_instance [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self.driver.rebuild(**kwargs) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise NotImplementedError() [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] NotImplementedError [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] During handling of the above exception, another exception occurred: [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Traceback (most recent call last): [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3574, in _rebuild_volume_backed_instance [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._detach_root_volume(context, instance, root_bdm) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3553, in _detach_root_volume [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] with excutils.save_and_reraise_exception(): [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self.force_reraise() [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise self.value [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3539, in _detach_root_volume [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self.driver.detach_volume(context, old_connection_info, [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] return self._volumeops.detach_volume(connection_info, instance) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File 
"/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._detach_volume_vmdk(connection_info, instance) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] stable_ref.fetch_moref(session) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] nova.exception.InstanceNotFound: Instance a84c5537-9ad1-44d6-b732-fda1156bff86 could not be found. [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] During handling of the above exception, another exception occurred: [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Traceback (most recent call last): [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 10835, in _error_out_instance_on_exception [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] yield [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3842, in rebuild_instance [ 1674.342876] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._do_rebuild_instance_with_claim( [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3928, in _do_rebuild_instance_with_claim [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._do_rebuild_instance( [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 4120, in _do_rebuild_instance [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] self._rebuild_default_impl(**kwargs) [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3697, in _rebuild_default_impl [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] 
self._rebuild_volume_backed_instance( [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] File "/opt/stack/nova/nova/compute/manager.py", line 3589, in _rebuild_volume_backed_instance [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] raise exception.BuildAbortException( [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] nova.exception.BuildAbortException: Build of instance a84c5537-9ad1-44d6-b732-fda1156bff86 aborted: Failed to rebuild volume backed instance. [ 1674.344076] env[67820]: ERROR nova.compute.manager [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] [ 1674.445937] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.445937] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.616578] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1674.637521] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49dba5df-745d-41d8-a5a4-0f89243d305d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.647084] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-362babd3-0c85-4c2a-876c-bf80a08ac3c7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.681663] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9394c3c5-7abf-4c16-8742-70db2b5e46e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.684318] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "a84c5537-9ad1-44d6-b732-fda1156bff86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.684545] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.684739] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.684915] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.685090] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.687353] env[67820]: INFO nova.compute.manager [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Terminating instance [ 1674.691695] env[67820]: DEBUG nova.compute.manager [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Start destroying the instance on the hypervisor. 
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1674.692838] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9540a68a-ee18-49e7-b77d-47a508449f75 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.696548] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-56f04b60-a15c-4bcf-9746-12c0acfb4fec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.708238] env[67820]: DEBUG nova.compute.provider_tree [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1674.713083] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84e227b4-ee25-44f1-b687-fd05891972f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.724030] env[67820]: DEBUG nova.scheduler.client.report [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1674.745611] env[67820]: WARNING nova.virt.vmwareapi.driver [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance a84c5537-9ad1-44d6-b732-fda1156bff86 could not be found. 
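Editorially inserted context: every VMware driver operation above first resolves the instance UUID to a vCenter managed-object reference, and once the failed rebuild unregistered the VM, that lookup is what raises the InstanceNotFound seen both in the root-volume detach traceback and in this terminate path (vm_util.get_vm_ref -> fetch_moref in the tracebacks). A minimal sketch of that control flow, assuming a placeholder session object with a hypothetical find_by_uuid search helper rather than Nova's real API:

    class InstanceNotFound(Exception):
        def __init__(self, instance_id):
            super().__init__("Instance %s could not be found." % instance_id)

    def get_vm_ref(session, instance_uuid):
        """Resolve a Nova instance UUID to a vCenter VM reference."""
        moref = session.find_by_uuid(instance_uuid)  # placeholder search call
        if moref is None:
            # The branch taken once the rebuild has already unregistered the
            # VM: detach and terminate can no longer locate it by UUID.
            raise InstanceNotFound(instance_uuid)
        return moref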
[ 1674.745823] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1674.746569] env[67820]: DEBUG oslo_concurrency.lockutils [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.301s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.746750] env[67820]: INFO nova.compute.manager [None req-db3f1630-8734-4d80-b729-03dc8db765e0 tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Successfully reverted task state from rebuilding on failure for instance. [ 1674.751766] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cef92f58-7322-485b-8c46-7b8592f95b87 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.760237] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7e556d1-31c2-452e-97e6-47aaa02f98e5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.790164] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a84c5537-9ad1-44d6-b732-fda1156bff86 could not be found. [ 1674.790364] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1674.790538] env[67820]: INFO nova.compute.manager [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Took 0.10 seconds to destroy the instance on the hypervisor. [ 1674.790771] env[67820]: DEBUG oslo.service.loopingcall [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1674.790988] env[67820]: DEBUG nova.compute.manager [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1674.791097] env[67820]: DEBUG nova.network.neutron [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1675.624018] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.624018] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.624018] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.632294] env[67820]: DEBUG nova.network.neutron [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1675.640819] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.640819] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.640988] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1675.641119] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1675.643198] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ec26c37-1e13-4f90-bbe1-25e2026b3134 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.648109] env[67820]: INFO nova.compute.manager [-] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Took 0.86 seconds to deallocate network for instance. 
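Editorially inserted context: the "Waiting for function ... _deallocate_network_with_retries to return" record above is oslo.service's looping-call helper, which re-invokes the wrapped function on a fixed interval until it signals completion by raising LoopingCallDone. A minimal sketch of that pattern, with the actual Neutron deallocation replaced by a stub; only the loopingcall usage reflects the real library API:

    from oslo_service import loopingcall

    def try_deallocate():
        return True  # stub standing in for the real Neutron call

    def _deallocate_with_retries(state):
        # The wrapped function runs once per interval; raising
        # LoopingCallDone stops the loop and hands back a return value.
        state["attempts"] += 1
        if try_deallocate():
            raise loopingcall.LoopingCallDone(retvalue=True)
        if state["attempts"] >= 3:
            raise loopingcall.LoopingCallDone(retvalue=False)

    state = {"attempts": 0}
    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries, state)
    succeeded = timer.start(interval=2).wait()  # blocks until LoopingCallDone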
[ 1675.654161] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55fe5cd2-77b5-4a50-bd4b-505d8e926ce6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.673065] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f6d34d9-9c07-42b3-9021-3c4db04958b1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.673927] env[67820]: DEBUG nova.compute.manager [req-96e158c1-8e66-477d-84eb-53755376cd45 req-b5d8dd44-a146-466f-ad25-ff6ec370edc4 service nova] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Received event network-vif-deleted-92a38cfe-36fa-4106-b732-63c92224a86f {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1675.679267] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceb84ef9-121d-4855-91da-064398f1b373 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.710820] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1675.710820] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.711061] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.738755] env[67820]: INFO nova.compute.manager [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Took 0.09 seconds to detach 1 volumes for instance. [ 1675.740858] env[67820]: DEBUG nova.compute.manager [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Deleting volume: 10770546-90a9-41c9-87c1-ddf570ebdba4 {{(pid=67820) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}} [ 1675.802606] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.802894] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803445] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803445] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803445] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803792] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803792] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803894] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.803992] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.804086] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.827819] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.840680] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.859588] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.871734] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.884736] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.899538] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 813af34e-49cc-40a9-a0d0-388a84fde493 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.914176] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.914416] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 11 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1675.914640] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1920MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=11 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1676.173633] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa3059a-880b-46a4-9d73-f01246b0c1ae {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.181787] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eb5efaf-652c-4997-9157-bedebbf47eae {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.214945] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23d8de5f-2092-4514-9fb5-704f3d308f9e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.222432] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b5de464-49c7-48be-bcb6-637408ed29a0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.237180] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1676.246147] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1676.261992] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1676.262224] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.551s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1676.262494] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.435s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1676.262735] env[67820]: DEBUG nova.objects.instance [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lazy-loading 'resources' on Instance uuid a84c5537-9ad1-44d6-b732-fda1156bff86 {{(pid=67820) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1676.504057] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6c9b98b-dc66-48c5-bfbb-4db4b68220e1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.511546] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96aac6b9-8d81-4029-86ef-c319c7627fa4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.541434] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f46b69b-9388-49ad-98a1-625bd09f22bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.548687] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16cc5aef-05e9-44b7-af04-eda39ab75c09 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.561832] env[67820]: DEBUG nova.compute.provider_tree [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1676.572985] env[67820]: DEBUG nova.scheduler.client.report [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1676.588614] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.326s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1676.662814] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52c72fa6-add0-4bbf-8e7f-15fae905d2bc tempest-ServerActionsV293TestJSON-1449060620 tempest-ServerActionsV293TestJSON-1449060620-project-member] Lock "a84c5537-9ad1-44d6-b732-fda1156bff86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.978s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.264475] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.264654] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1677.264773] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1677.287254] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.287439] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.287600] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.287732] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.287853] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.287973] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building.
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.288106] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.288226] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.288346] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.288463] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1677.288580] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1677.621100] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1677.621394] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1695.007596] env[67820]: WARNING oslo_vmware.rw_handles [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1695.007596] 
env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1695.007596] env[67820]: ERROR oslo_vmware.rw_handles [ 1695.008237] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1695.009965] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1695.010240] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Copying Virtual Disk [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/c948cd00-3e66-42c4-85d9-80fde1a45fad/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1695.010563] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fa9996de-6ec5-43b2-8f1b-77861eddfa9f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.017924] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for the task: (returnval){ [ 1695.017924] env[67820]: value = "task-3467454" [ 1695.017924] env[67820]: _type = "Task" [ 1695.017924] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1695.025819] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Task: {'id': task-3467454, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1695.528045] env[67820]: DEBUG oslo_vmware.exceptions [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1695.528343] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1695.528884] env[67820]: ERROR nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1695.528884] env[67820]: Faults: ['InvalidArgument'] [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Traceback (most recent call last): [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] yield resources [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self.driver.spawn(context, instance, image_meta, [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self._fetch_image_if_missing(context, vi) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] image_cache(vi, tmp_image_ds_loc) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] vm_util.copy_virtual_disk( [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] session._wait_for_task(vmdk_copy_task) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return self.wait_for_task(task_ref) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return evt.wait() [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] result = hub.switch() [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return self.greenlet.switch() [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self.f(*self.args, **self.kw) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] raise exceptions.translate_fault(task_info.error) [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Faults: ['InvalidArgument'] [ 1695.528884] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] [ 1695.529756] env[67820]: INFO nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Terminating instance [ 1695.530704] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1695.530908] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1695.531157] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0691c71a-1ac9-4dd0-bf29-56324531b5fd 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.533929] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1695.534139] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1695.534866] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53b3b56b-addd-4f15-81b0-bf771e32ecfc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.541675] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1695.541888] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cef78d42-8f5f-43ad-8310-13ef66cdd1e2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.544085] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1695.544263] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1695.545206] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b130eb3e-5d78-4835-b47a-c189ceb3e642 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.551052] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1695.551052] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]526e663e-9af1-0efd-3d26-387c6eb67120" [ 1695.551052] env[67820]: _type = "Task" [ 1695.551052] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1695.557603] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]526e663e-9af1-0efd-3d26-387c6eb67120, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1695.615222] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1695.615496] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1695.615680] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Deleting the datastore file [datastore1] f4d41e35-6408-4cd8-a7e0-52b030e56b40 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1695.615944] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ce38980d-dcfb-4b88-b6fc-50f45ee0a90d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1695.621458] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for the task: (returnval){ [ 1695.621458] env[67820]: value = "task-3467456" [ 1695.621458] env[67820]: _type = "Task" [ 1695.621458] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1695.628937] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Task: {'id': task-3467456, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1695.782437] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1696.059959] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1696.060246] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1696.060478] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11ce64ea-fe1c-482f-8ff2-739112be6ecd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.071857] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1696.072057] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Fetch image to [datastore1] vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1696.072229] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1696.072925] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a1f945d-f89b-411d-9f1c-65bf0e828dca {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.078978] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2296c30-aa97-45e7-bb29-f4be81fd0a9f {{(pid=67820) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.087589] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74ce61f9-eb51-4729-b285-5ea9d6f88c32 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.118337] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8e7089f-7a72-4b75-993c-24a2238450b8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.126208] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-808d76f5-2977-45e1-a65e-3b7e1ad5576c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.130378] env[67820]: DEBUG oslo_vmware.api [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Task: {'id': task-3467456, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088054} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1696.130884] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1696.131080] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1696.131256] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1696.131428] env[67820]: INFO nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1696.133445] env[67820]: DEBUG nova.compute.claims [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1696.133614] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1696.133826] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1696.151429] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1696.205962] env[67820]: DEBUG oslo_vmware.rw_handles [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1696.264681] env[67820]: DEBUG oslo_vmware.rw_handles [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1696.265020] env[67820]: DEBUG oslo_vmware.rw_handles [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1696.421669] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b9ef411-80b9-4c21-ab6b-d3514cb868ef {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.429244] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4b833ce-4670-48d8-b7ef-ceac4d90d6a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.457734] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ea91ff-1baf-44d2-a349-13f59ff2e85c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.464073] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df9f57a-d2a2-4361-98bc-3ecc8dd661ca {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1696.477852] env[67820]: DEBUG nova.compute.provider_tree [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1696.488189] env[67820]: DEBUG nova.scheduler.client.report [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1696.501604] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.368s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1696.502118] env[67820]: ERROR nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.502118] env[67820]: Faults: ['InvalidArgument'] [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Traceback (most recent call last): [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1696.502118] 
env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self.driver.spawn(context, instance, image_meta, [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self._fetch_image_if_missing(context, vi) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] image_cache(vi, tmp_image_ds_loc) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] vm_util.copy_virtual_disk( [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] session._wait_for_task(vmdk_copy_task) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return self.wait_for_task(task_ref) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return evt.wait() [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] result = hub.switch() [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] return self.greenlet.switch() [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] self.f(*self.args, **self.kw) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] raise exceptions.translate_fault(task_info.error) [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Faults: ['InvalidArgument'] [ 1696.502118] env[67820]: ERROR nova.compute.manager [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] [ 1696.502859] env[67820]: DEBUG nova.compute.utils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1696.504246] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Build of instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 was re-scheduled: A specified parameter was not correct: fileType [ 1696.504246] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1696.504621] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1696.504793] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1696.504959] env[67820]: DEBUG nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1696.505132] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1697.188908] env[67820]: DEBUG nova.network.neutron [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1697.206535] env[67820]: INFO nova.compute.manager [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Took 0.70 seconds to deallocate network for instance. [ 1697.298393] env[67820]: INFO nova.scheduler.client.report [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Deleted allocations for instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 [ 1697.321128] env[67820]: DEBUG oslo_concurrency.lockutils [None req-58b8e26d-3b79-44c9-87cb-280aac9eebcb tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 628.456s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.322433] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 432.640s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.322681] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Acquiring lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1697.323112] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.323261] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.325661] env[67820]: INFO nova.compute.manager [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Terminating instance [ 1697.328138] env[67820]: DEBUG nova.compute.manager [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1697.328138] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1697.328450] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-91d5fe8f-61a1-4d8a-b117-a95a33b29e1c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.337942] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44bb076f-21e4-41ae-85f0-4d28efcfde84 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.350133] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1697.370649] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f4d41e35-6408-4cd8-a7e0-52b030e56b40 could not be found.
[ 1697.371283] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1697.371283] env[67820]: INFO nova.compute.manager [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1697.371283] env[67820]: DEBUG oslo.service.loopingcall [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1697.371508] env[67820]: DEBUG nova.compute.manager [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1697.371638] env[67820]: DEBUG nova.network.neutron [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1697.398105] env[67820]: DEBUG nova.network.neutron [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1697.401780] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1697.402015] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.403403] env[67820]: INFO nova.compute.claims [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1697.406417] env[67820]: INFO nova.compute.manager [-] [instance: f4d41e35-6408-4cd8-a7e0-52b030e56b40] Took 0.03 seconds to deallocate network for instance.
[ 1697.502090] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5d135c02-a70a-4973-8295-ef7b0bc81bce tempest-AttachInterfacesV270Test-1691333145 tempest-AttachInterfacesV270Test-1691333145-project-member] Lock "f4d41e35-6408-4cd8-a7e0-52b030e56b40" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.180s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.630511] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e4ecc58-9973-4286-b4a2-ddcb5b64c226 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.638575] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fe17c3-d5e2-48b9-a3e3-ebbcb517382b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.668021] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b52f931c-1d9f-46d2-bf86-75b7058bae58 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.674956] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e878aca-3a96-4a94-902b-412b7d924b1b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.687799] env[67820]: DEBUG nova.compute.provider_tree [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1697.696234] env[67820]: DEBUG nova.scheduler.client.report [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1697.711205] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.712134] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Start building networks asynchronously for instance.
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1697.747828] env[67820]: DEBUG nova.compute.utils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1697.749940] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Not allocating networking since 'none' was specified. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1697.759142] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1697.826026] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1697.847371] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1697.847609] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1697.847762] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1697.847936] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1697.848097] env[67820]: DEBUG 
nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1697.848244] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1697.848446] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1697.848604] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1697.848769] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1697.848930] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1697.849220] env[67820]: DEBUG nova.virt.hardware [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1697.851142] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0282529-64ac-4f9d-ad9f-6f062e64c83f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.859234] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cff0ffe-9897-4fca-9b05-1f1480c072a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.873045] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance VIF info [] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1697.878563] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Creating folder: Project 
(928d3ef4d80847aa98cec349f058b914). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1697.878854] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e1ab6546-d60a-40e8-8d5f-d74f9d2f68bc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.888803] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Created folder: Project (928d3ef4d80847aa98cec349f058b914) in parent group-v692668. [ 1697.888989] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Creating folder: Instances. Parent ref: group-v692760. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1697.889813] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cc66972a-ad14-4f9e-adda-fb2f9a790d4e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.898278] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Created folder: Instances in parent group-v692760. [ 1697.898516] env[67820]: DEBUG oslo.service.loopingcall [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1697.898703] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1697.898896] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8e657f5-5361-43f6-a0d7-aab236ff352a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.915174] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1697.915174] env[67820]: value = "task-3467459" [ 1697.915174] env[67820]: _type = "Task" [ 1697.915174] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1697.925166] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467459, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1698.190918] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ddfd77fc-3d33-4864-bf88-1f6484518ca1 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "5c10efe6-fd80-430e-b647-3eaf2213af78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1698.191242] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ddfd77fc-3d33-4864-bf88-1f6484518ca1 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "5c10efe6-fd80-430e-b647-3eaf2213af78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1698.425233] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467459, 'name': CreateVM_Task, 'duration_secs': 0.388169} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1698.425439] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1698.425879] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1698.426056] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1698.426416] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1698.426667] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0dbe5408-6a11-4063-aab1-52d9e04f948b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.430989] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for the task: (returnval){ [ 1698.430989] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52629280-444b-e5e4-bb16-5c4054ef783b" [ 1698.430989] env[67820]: _type = "Task" [ 1698.430989] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
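
The records above show the image-cache locking pattern the VMware driver uses around _fetch_image_if_missing: take a named lock (plus an external, file-based semaphore) on the cached image path, search the datastore, and download only if the image is absent. Below is a minimal sketch of that check-then-fetch-under-lock pattern using oslo.concurrency; the two cache helpers are hypothetical in-memory stand-ins, not Nova's real functions.

from oslo_concurrency import lockutils

# Hypothetical stand-ins for the datastore cache helpers, only so the
# sketch is self-contained and runnable.
_CACHE = set()

def image_exists_on_datastore(image_id):
    return image_id in _CACHE

def fetch_image_to_datastore(image_id):
    _CACHE.add(image_id)

def fetch_image_if_missing(image_id, datastore="datastore1"):
    # The lock name mirrors the log records:
    # "[datastore1] devstack-image-cache_base/<image_id>"
    lock_name = "[%s] devstack-image-cache_base/%s" % (datastore, image_id)
    # lockutils.lock() is a context manager; with external=True it would
    # also take the file-based inter-process lock that the log reports as
    # an "external semaphore".
    with lockutils.lock(lock_name):
        if not image_exists_on_datastore(image_id):
            fetch_image_to_datastore(image_id)

if __name__ == "__main__":
    fetch_image_if_missing("4407539e-b292-42b4-91c4-4faa60d48bab")

The lock is what serializes concurrent builds against the same image: a second worker blocks on acquire, then finds the image already cached and skips the download.
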
[ 1698.439546] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52629280-444b-e5e4-bb16-5c4054ef783b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1698.941317] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1698.941584] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1698.941794] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1726.616697] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1726.640206] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1726.640382] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1726.652286] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 1 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1726.652567] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: a84c5537-9ad1-44d6-b732-fda1156bff86] Instance has had 0 of 5 cleanup attempts {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11211}} [ 1731.669522] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.622325] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task
ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.622479] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1733.621736] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1733.622099] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1735.634158] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1736.615975] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1736.620511] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.621447] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.621839] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.633901] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1737.634130] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1737.634303] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1737.634458] env[67820]: DEBUG 
nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1737.635570] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e791dcbb-68b4-4d16-8a5e-be070ef86e60 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.644355] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d751e19-e9e3-4b39-9f14-94cf559d1d22 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.657858] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db0aa55b-4447-4dc1-8b34-69c3b1a66210 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.663743] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc7d67b1-13a8-4a03-ad7b-324238cd301e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.691923] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180889MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1737.692073] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1737.692255] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1737.834071] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834256] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834385] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834506] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834627] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834744] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834857] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.834969] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.835094] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.835208] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.846346] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.859208] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.868460] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.880122] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 813af34e-49cc-40a9-a0d0-388a84fde493 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.891673] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.902123] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5c10efe6-fd80-430e-b647-3eaf2213af78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.902325] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1737.902471] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1737.918988] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1737.932670] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1737.932898] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1737.943006] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1737.959677] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1738.127600] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2568ee38-6903-496a-844e-456b2d432060 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1738.134929] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1bff696e-ee9c-4f98-945a-efbb94d7e75f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1738.164490] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05580388-f950-4533-b7da-9143078808fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1738.172172] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d216b4-eb97-42e2-833a-aedd8af334be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1738.184978] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1738.193273] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1738.212253] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1738.212463] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1739.212996] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1739.213278] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1739.213351] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1739.235096] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.235332] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.235476] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.235657] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.235896] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236049] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236291] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236291] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236402] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236520] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1739.236672] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1739.620899] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1740.621774] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1745.024042] env[67820]: WARNING oslo_vmware.rw_handles [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1745.024042] env[67820]: ERROR oslo_vmware.rw_handles [ 1745.024602] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1745.026486] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1745.026756] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Copying Virtual Disk [datastore1] vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] 
vmware_temp/2d6c21fd-9833-4801-9de7-dcd9cee3527b/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1745.027059] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-daa27803-e119-4de7-b302-2f9f50a41171 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.035080] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1745.035080] env[67820]: value = "task-3467460" [ 1745.035080] env[67820]: _type = "Task" [ 1745.035080] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1745.042236] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467460, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1745.546895] env[67820]: DEBUG oslo_vmware.exceptions [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1745.546895] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1745.547340] env[67820]: ERROR nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1745.547340] env[67820]: Faults: ['InvalidArgument'] [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Traceback (most recent call last): [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] yield resources [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self.driver.spawn(context, instance, image_meta, [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 
3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self._fetch_image_if_missing(context, vi) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] image_cache(vi, tmp_image_ds_loc) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] vm_util.copy_virtual_disk( [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] session._wait_for_task(vmdk_copy_task) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return self.wait_for_task(task_ref) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return evt.wait() [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] result = hub.switch() [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return self.greenlet.switch() [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self.f(*self.args, **self.kw) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] raise exceptions.translate_fault(task_info.error) [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Faults: ['InvalidArgument'] [ 1745.547340] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] [ 1745.548314] env[67820]: INFO nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Terminating instance [ 1745.549219] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1745.549357] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1745.549582] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15adbd16-f63b-4896-9c0f-41e4bd72cec9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.552031] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Start destroying the instance on the hypervisor. 
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1745.552031] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1745.552676] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17f4d4db-eea4-48c1-8ba4-3fbbf727c8ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.559359] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1745.559565] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6a20bd47-fec2-4ba2-a7ac-c2a35755fb73 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.561901] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1745.562176] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1745.562724] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-42b205fe-a158-405b-aaae-c99166635568 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.567343] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1745.567343] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52510005-9011-8341-96d1-a6a96e0d4052" [ 1745.567343] env[67820]: _type = "Task" [ 1745.567343] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1745.580850] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52510005-9011-8341-96d1-a6a96e0d4052, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1745.633776] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1745.634013] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1745.634204] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleting the datastore file [datastore1] 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1745.634547] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eb7aef17-0925-4311-bd06-4a8dfb863192 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1745.640839] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 1745.640839] env[67820]: value = "task-3467462" [ 1745.640839] env[67820]: _type = "Task" [ 1745.640839] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1745.648432] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467462, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.078087] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1746.078395] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Creating directory with path [datastore1] vmware_temp/cce29d70-61f0-4769-bffe-71808c38b3e5/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1746.078585] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5169b4a7-ce19-48e6-afa4-c452a4a5ca8a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.090251] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Created directory with path [datastore1] vmware_temp/cce29d70-61f0-4769-bffe-71808c38b3e5/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1746.090438] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Fetch image to [datastore1] vmware_temp/cce29d70-61f0-4769-bffe-71808c38b3e5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1746.090606] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/cce29d70-61f0-4769-bffe-71808c38b3e5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1746.091401] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6786660e-9b35-4a4d-8c7a-14dc4ef54491 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.097888] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac927a8-16be-4383-bdd2-70b5b2565fec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.107279] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7916ab3a-a286-42a4-8813-6621e9757480 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.136734] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-842962aa-f648-45ac-9e62-a142f96c7040 {{(pid=67820) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.144890] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a1b1963e-0acd-4b71-a95b-e37ccb72a622 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.151645] env[67820]: DEBUG oslo_vmware.api [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467462, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065171} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1746.151645] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1746.151645] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1746.151645] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1746.151645] env[67820]: INFO nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1746.153406] env[67820]: DEBUG nova.compute.claims [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1746.153595] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1746.153833] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1746.173387] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1746.314704] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1746.315529] env[67820]: ERROR nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. 
[ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] yield resources [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.driver.spawn(context, instance, image_meta, [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._fetch_image_if_missing(context, vi) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image_fetch(context, vi, tmp_image_ds_loc) [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] images.fetch_image( [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1746.315529] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] metadata = IMAGE_API.get(context, image_ref) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return session.show(context, image_id, [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] _reraise_translated_image_exception(image_id) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise new_exc.with_traceback(exc_trace) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. 
[ 1746.316715] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1746.316715] env[67820]: INFO nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Terminating instance [ 1746.317645] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1746.317879] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1746.318557] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-41bca6d2-9ead-43de-b201-132d1795f9d3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.320972] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1746.321199] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1746.322049] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e43bb617-171c-4bab-a32b-ba61a79763ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.330268] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1746.331302] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7374b186-f5f4-4c89-9c2b-de30d44aa605 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.332742] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1746.332944] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] 
Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1746.333692] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-edbc7009-daab-437a-8edb-7b8acc68640d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.340113] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for the task: (returnval){ [ 1746.340113] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5271978d-a1cc-cc2b-3a87-5ccfb5ee6b4d" [ 1746.340113] env[67820]: _type = "Task" [ 1746.340113] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1746.347836] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5271978d-a1cc-cc2b-3a87-5ccfb5ee6b4d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.388748] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-354d7147-af32-465e-9920-4c725996c645 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.395938] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c94b4b6-76a7-43e5-9ae0-5f37b317cccf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.425662] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5752a4bf-4665-432c-8d0c-8acbdcc154ba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.429219] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1746.429422] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1746.429596] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleting the datastore file [datastore1] d3dc6127-8512-4c5c-b04e-b6a639a1d1de {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1746.430171] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-920e1c14-051e-4c85-b6a4-52e9066d620e {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.434685] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d058afc-2d6d-44f0-b935-1d670d3dd47f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.439301] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for the task: (returnval){ [ 1746.439301] env[67820]: value = "task-3467464" [ 1746.439301] env[67820]: _type = "Task" [ 1746.439301] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1746.449757] env[67820]: DEBUG nova.compute.provider_tree [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1746.454988] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': task-3467464, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1746.458594] env[67820]: DEBUG nova.scheduler.client.report [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1746.471455] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.318s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1746.471944] env[67820]: ERROR nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1746.471944] env[67820]: Faults: ['InvalidArgument'] [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Traceback (most recent call last): [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1746.471944] env[67820]: ERROR 
nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self.driver.spawn(context, instance, image_meta, [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self._fetch_image_if_missing(context, vi) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] image_cache(vi, tmp_image_ds_loc) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] vm_util.copy_virtual_disk( [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] session._wait_for_task(vmdk_copy_task) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return self.wait_for_task(task_ref) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return evt.wait() [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] result = hub.switch() [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] return self.greenlet.switch() [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] self.f(*self.args, **self.kw) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] raise exceptions.translate_fault(task_info.error) [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Faults: ['InvalidArgument'] [ 1746.471944] env[67820]: ERROR nova.compute.manager [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] [ 1746.472711] env[67820]: DEBUG nova.compute.utils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1746.474010] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Build of instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 was re-scheduled: A specified parameter was not correct: fileType [ 1746.474010] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1746.474386] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1746.474577] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1746.474723] env[67820]: DEBUG nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1746.474882] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1746.783797] env[67820]: DEBUG nova.network.neutron [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1746.795841] env[67820]: INFO nova.compute.manager [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Took 0.32 seconds to deallocate network for instance. [ 1746.852689] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1746.853220] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Creating directory with path [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1746.853714] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7452d840-b218-4b21-bc84-f3600b1e3c5f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.867404] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Created directory with path [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1746.867615] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Fetch image to [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1746.867782] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 
tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1746.868628] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1978ad28-11c3-4913-a8b1-eda3667935bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.877025] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd6dbb6-8249-48c1-8dd4-3cf0a62bdb96 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.889815] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31723f6f-763b-4dda-8368-64cfdceb1e44 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.922810] env[67820]: INFO nova.scheduler.client.report [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted allocations for instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 [ 1746.929243] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1daaefac-94ab-4a27-a9ae-b7d7e294feb6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.936748] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a029652-1897-476c-a8ec-b3af2efa20e6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.948137] env[67820]: DEBUG oslo_vmware.api [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Task: {'id': task-3467464, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086882} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1746.948390] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1746.948573] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1746.948744] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1746.948914] env[67820]: INFO nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1746.951096] env[67820]: DEBUG oslo_concurrency.lockutils [None req-98202f8f-4f9c-47f7-8098-81f7cef003cf tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 639.788s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1746.951525] env[67820]: DEBUG nova.compute.claims [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1746.951690] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1746.951904] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1746.955111] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 443.999s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} 
[ 1746.955326] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1746.955574] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1746.955732] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1746.957406] env[67820]: INFO nova.compute.manager [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Terminating instance [ 1746.959309] env[67820]: DEBUG nova.compute.manager [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1746.959499] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1746.959955] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-857a41ea-910f-4313-ba4e-13b2d64f16d6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1746.963353] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1746.966068] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1746.975019] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1f6ce50-d40d-411f-8a4c-d0fdd0e65b01 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.001318] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5 could not be found. [ 1747.001551] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1747.001725] env[67820]: INFO nova.compute.manager [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1747.004073] env[67820]: DEBUG oslo.service.loopingcall [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1747.005411] env[67820]: DEBUG nova.compute.manager [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1747.005508] env[67820]: DEBUG nova.network.neutron [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1747.036583] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.038264] env[67820]: DEBUG oslo_vmware.rw_handles [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1747.094769] env[67820]: DEBUG nova.network.neutron [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1747.101392] env[67820]: DEBUG oslo_vmware.rw_handles [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1747.101587] env[67820]: DEBUG oslo_vmware.rw_handles [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1747.104419] env[67820]: INFO nova.compute.manager [-] [instance: 3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5] Took 0.10 seconds to deallocate network for instance. [ 1747.197331] env[67820]: DEBUG oslo_concurrency.lockutils [None req-529388ad-b9cd-4159-8599-2d645153b8dc tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "3ae1f86b-f28c-4f0b-b129-127e3ed6a5f5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.242s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.280872] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8d577bb-b7f5-4c8f-be4a-c90de92408b6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.288978] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7a23d22-a4e6-44f4-bcbd-6d27bd6bae14 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.320239] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ab021d-d7e3-4195-a0e6-976436363911 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.327957] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d6ecea4-1e0b-486c-95b1-2a7f816ec6f1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.342442] env[67820]: DEBUG nova.compute.provider_tree [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1747.351308] env[67820]: DEBUG nova.scheduler.client.report [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Inventory has not 
changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1747.367444] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.415s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.368178] env[67820]: ERROR nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1747.368178] 
env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.driver.spawn(context, instance, image_meta, [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._fetch_image_if_missing(context, vi) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image_fetch(context, vi, tmp_image_ds_loc) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] images.fetch_image( [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] 
metadata = IMAGE_API.get(context, image_ref) [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1747.368178] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return session.show(context, image_id, [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] _reraise_translated_image_exception(image_id) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise new_exc.with_traceback(exc_trace) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. [ 1747.369268] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.369268] env[67820]: DEBUG nova.compute.utils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1747.369994] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.333s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.371272] env[67820]: INFO nova.compute.claims [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1747.374081] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Build of instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de was re-scheduled: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1747.374544] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1747.374713] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1747.374875] env[67820]: DEBUG nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1747.375046] env[67820]: DEBUG nova.network.neutron [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1747.480494] env[67820]: DEBUG neutronclient.v2_0.client [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67820) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1747.481729] env[67820]: ERROR nova.compute.manager [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.driver.spawn(context, instance, image_meta, [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._fetch_image_if_missing(context, vi) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image_fetch(context, vi, tmp_image_ds_loc) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] images.fetch_image( [ 1747.481729] env[67820]: ERROR 
nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] metadata = IMAGE_API.get(context, image_ref) [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1747.481729] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return session.show(context, image_id, [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] _reraise_translated_image_exception(image_id) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise new_exc.with_traceback(exc_trace) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = getattr(controller, method)(*args, **kwargs) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._get(image_id) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] resp, body = self.http_client.get(url, headers=header) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.request(url, 'GET', **kwargs) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1747.482853] 
env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self._handle_response(resp) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exc.from_response(resp, resp.content) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.ImageNotAuthorized: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._build_and_run_instance(context, instance, image, [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exception.RescheduledException( [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.RescheduledException: Build of instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de was re-scheduled: Not authorized for image 4407539e-b292-42b4-91c4-4faa60d48bab. 
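The traceback above records the translation step in nova/image/glance.py: the raw glanceclient.exc.HTTPUnauthorized (HTTP 401) raised inside show() is converted by _reraise_translated_image_exception() into nova.exception.ImageNotAuthorized and re-raised with the original traceback attached, which is why both exceptions and the full glanceclient call chain appear in the same log record. A minimal, runnable sketch of that pattern follows; the classes here are simplified stand-ins, not the actual Nova or glanceclient code:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized (HTTP 401)."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""
        def __init__(self, image_id):
            super().__init__("Not authorized for image %s." % image_id)

    def _translate_image_exception(image_id, exc_value):
        # Client-level auth failures become a Nova-level exception;
        # anything unrecognized passes through unchanged.
        if isinstance(exc_value, HTTPUnauthorized):
            return ImageNotAuthorized(image_id)
        return exc_value

    def show(image_id):
        try:
            raise HTTPUnauthorized("HTTP 401 Unauthorized")  # the failing image GET
        except Exception:
            _exc_type, exc_value, exc_trace = sys.exc_info()
            new_exc = _translate_image_exception(image_id, exc_value)
            # Re-raising with the original traceback preserves the
            # glanceclient frames seen in the log record above.
            raise new_exc.with_traceback(exc_trace)

Note that the failure is a 401, not a 404: the image exists, but the token presented to Glance was no longer accepted (plausibly expired during the long build), so the request failed before the image could even be resolved.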
[ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] exception_handler_v20(status_code, error_body) [ 1747.482853] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise client_exc(message=error_message, [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Neutron server returns request_ids: ['req-cb32d491-5735-45a8-899f-40e52f92b0ef'] [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._deallocate_network(context, instance, requested_networks) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.network_api.deallocate_for_instance( [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] data = neutron.list_ports(**search_opts) [ 
1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.list('ports', self.ports_path, retrieve_all, [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] for r in self._pagination(collection, path, **params): [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] res = self.get(path, params=params) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.retry_request("GET", action, body=body, [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.do_request(method, action, body=body, [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._handle_fault_response(status_code, replybody, resp) [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exception.Unauthorized() [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.Unauthorized: Not authorized. [ 1747.484041] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.554659] env[67820]: INFO nova.scheduler.client.report [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Deleted allocations for instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de [ 1747.576738] env[67820]: DEBUG oslo_concurrency.lockutils [None req-803ff980-9f76-4d7e-8a6e-80684e70846b tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 572.422s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.577550] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 376.421s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.577775] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Acquiring lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.577985] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.578165] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.580016] env[67820]: INFO nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de]
Terminating instance [ 1747.581625] env[67820]: DEBUG nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1747.581808] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1747.582277] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-64194baa-f6c3-49bb-afd7-76db96285483 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.591285] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2606a3f-18bf-490b-bf59-656c684c1aea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.604111] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1747.625053] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de could not be found. [ 1747.625310] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1747.625544] env[67820]: INFO nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1747.625875] env[67820]: DEBUG oslo.service.loopingcall [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1747.628411] env[67820]: DEBUG nova.compute.manager [-] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1747.628545] env[67820]: DEBUG nova.network.neutron [-] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1747.642016] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f954cb0-6b1f-44f5-9b32-2cb7badda8bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.649413] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a83e91b2-304d-4a6f-ad72-30636d33f03e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.654789] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1747.683866] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aaf0433-8e70-4ce1-bd4c-c1c03e73bd20 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.690232] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06e5ccb3-115a-4a29-8668-39786089cc2e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.703819] env[67820]: DEBUG nova.compute.provider_tree [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1747.715849] env[67820]: DEBUG nova.scheduler.client.report [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1747.731500] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.731974] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1747.734209] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.079s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1747.735524] env[67820]: INFO nova.compute.claims [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1747.764054] env[67820]: DEBUG nova.compute.utils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1747.765785] env[67820]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67820) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1747.766711] env[67820]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
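The same neutronclient 401 surfaces as two different Nova exceptions in this excerpt: the per-call wrapper in nova/network/neutron.py raises nova.exception.Unauthorized when the rejected request ran under the user's token (neutron.py line 204 in the traceback above) and NeutronAdminCredentialConfigurationInvalid when it ran under the admin credentials from nova.conf (line 212 in the traceback below), since a rejected admin token points at configuration rather than a stale user token. A hedged sketch of that dispatch, with stand-in exception classes rather than the actual Nova wrapper:

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the nova.exception class of the same name."""

    class NeutronClientUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    def wrap_neutron_call(func, used_admin_token):
        # Mirrors the dispatch visible in the two tracebacks: the same
        # HTTP 401 is re-raised differently depending on whose token
        # made the request.
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except NeutronClientUnauthorized:
                if used_admin_token:
                    # Admin token rejected: the [neutron] credentials in
                    # nova.conf are presumed invalid, i.e. a config error.
                    raise NeutronAdminCredentialConfigurationInvalid()
                # User token rejected (e.g. it expired mid-operation).
                raise Unauthorized()
        return wrapper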
[ 1747.767358] env[67820]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-3963d8b6-7f0a-49e1-97ac-c86d5754e9d4'] [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1747.767358] env[67820]: ERROR oslo.service.loopingcall [ 1747.768784] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1747.768784] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1747.770499] env[67820]: ERROR nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Failed to deallocate network for instance. 
Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1747.777074] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1747.826738] env[67820]: ERROR nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] exception_handler_v20(status_code, error_body) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise client_exc(message=error_message, [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Neutron server returns request_ids: ['req-3963d8b6-7f0a-49e1-97ac-c86d5754e9d4'] [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] During handling of the above exception, another exception occurred: [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Traceback (most recent call last): [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._delete_instance(context, instance, bdms) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: 
d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._shutdown_instance(context, instance, bdms) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._try_deallocate_network(context, instance, requested_networks) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] with excutils.save_and_reraise_exception(): [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.force_reraise() [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise self.value [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] _deallocate_network_with_retries() [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return evt.wait() [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = hub.switch() [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.greenlet.switch() [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = func(*self.args, **self.kw) [ 1747.826738] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1747.827783] env[67820]: ERROR nova.compute.manager 
[instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] result = f(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._deallocate_network( [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self.network_api.deallocate_for_instance( [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] data = neutron.list_ports(**search_opts) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.list('ports', self.ports_path, retrieve_all, [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] for r in self._pagination(collection, path, **params): [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] res = self.get(path, params=params) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.retry_request("GET", action, body=body, [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] return self.do_request(method, action, body=body, [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] ret = obj(*args, **kwargs) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] self._handle_fault_response(status_code, replybody, resp) [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1747.827783] env[67820]: ERROR nova.compute.manager [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] [ 1747.855100] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1747.861841] env[67820]: DEBUG nova.policy [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ef0be697ca848f4a39984e43ef9a396', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dae13838d30c46c0a67d0ed608b13558', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1747.864379] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Lock "d3dc6127-8512-4c5c-b04e-b6a639a1d1de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.286s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1747.884036] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1747.884036] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1747.884036] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1747.884264] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1747.884347] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1747.884494] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1747.884702] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1747.884862] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1747.885141] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1747.885395] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1747.885809] env[67820]: DEBUG nova.virt.hardware [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1747.887600] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6a5c982-d9f1-4de9-a4d1-d1390a628680 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.902803] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec900f8-8e52-4e5f-987b-430b97e74a93 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1747.923121] env[67820]: INFO nova.compute.manager [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] [instance: d3dc6127-8512-4c5c-b04e-b6a639a1d1de] Successfully reverted task state from None on failure for instance. [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server [None req-1ff63dba-6ac5-4507-bb6e-0564f8fdd3d1 tempest-ServersTestMultiNic-1148482182 tempest-ServersTestMultiNic-1148482182-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-3963d8b6-7f0a-49e1-97ac-c86d5754e9d4'] [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server raise self.value [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server raise self.value [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server raise self.value [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1747.926723] env[67820]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server raise self.value [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server raise self.value [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.928352] env[67820]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1747.928352] env[67820]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
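Both tracebacks above record the same failure: while deallocating ports for instance d3dc6127-8512-4c5c-b04e-b6a639a1d1de, neutron rejects nova's token with HTTP 401 ("The request you have made requires authentication."), and the wrapper visible in the frames at nova/network/neutron.py:196/212 converts that neutronclient Unauthorized into nova.exception.NeutronAdminCredentialConfigurationInvalid, i.e. it treats the 401 as a misconfiguration of the service credentials nova uses against neutron. A minimal, self-contained sketch of that translation pattern follows; the exception classes are stubs standing in for neutronclient.common.exceptions.Unauthorized and the nova.exception class, and the decorator name is illustrative, not nova's actual symbol.

import functools

class Unauthorized(Exception):
    """Stub for neutronclient.common.exceptions.Unauthorized (HTTP 401)."""

class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stub for the nova.exception class raised at neutron.py:212 above."""

def translate_unauthorized(func):
    # Roughly what the proxy wrapper at neutron.py:196 does: call through,
    # and if neutron rejects the token, surface a configuration error.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Unauthorized:
            raise NeutronAdminCredentialConfigurationInvalid()
    return wrapper

@translate_unauthorized
def list_ports(**search_opts):
    # Stand-in for neutronclient's list_ports; fails with 401 here,
    # mirroring the reply body logged above.
    raise Unauthorized("401: The request you have made requires authentication.")

try:
    list_ports(device_id="d3dc6127-8512-4c5c-b04e-b6a639a1d1de")
except NeutronAdminCredentialConfigurationInvalid as exc:
    print(type(exc).__name__)  # what _deallocate_network_with_retries re-raises

Because _try_deallocate_network re-raises after its retries, the terminate path fails and the RPC layer logs the exception a second time, which is why the same traceback appears under both nova.compute.manager and oslo_messaging.rpc.server.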
[ 1747.929903] env[67820]: ERROR oslo_messaging.rpc.server [ 1748.010184] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b85977-2b4d-4c96-b130-201f2bd57baf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.018170] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b275ecb4-2ab0-42ba-a83e-4f65f7d461cb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.047989] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3e9db8-3cf5-4e10-8869-e1cec5742104 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.055043] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c08d53d-12a4-458f-bfe4-43362b7834cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.068173] env[67820]: DEBUG nova.compute.provider_tree [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1748.076428] env[67820]: DEBUG nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1748.091613] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.357s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1748.092124] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1748.130926] env[67820]: DEBUG nova.compute.utils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1748.132202] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1748.132372] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1748.144798] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1748.191019] env[67820]: DEBUG nova.policy [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1c3af76995642bd8e9efed393e76655', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2bbd98fc847f441ab80f175642f3d12a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1748.224740] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1748.245917] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Successfully created port: 485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1748.251319] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1748.251618] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1748.251710] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1748.252042] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1748.252285] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1748.252360] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1748.252554] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies
/opt/stack/nova/nova/virt/hardware.py:569}} [ 1748.252710] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1748.252871] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1748.253065] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1748.253318] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1748.254145] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-221bd40b-1e78-4070-b3c3-a8cf3e0fbc29 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.262940] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1765a9c4-e5bb-4b11-a258-d61c73d78812 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1748.551330] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Successfully created port: c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1748.877422] env[67820]: DEBUG nova.compute.manager [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Received event network-vif-plugged-485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1748.877689] env[67820]: DEBUG oslo_concurrency.lockutils [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] Acquiring lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1748.877940] env[67820]: DEBUG oslo_concurrency.lockutils [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1748.878207] env[67820]: DEBUG
oslo_concurrency.lockutils [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1748.878430] env[67820]: DEBUG nova.compute.manager [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] No waiting events found dispatching network-vif-plugged-485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1748.878603] env[67820]: WARNING nova.compute.manager [req-13393467-6df2-44df-8a79-4c3983f89aea req-40ed802d-2ec4-44a5-a5a7-d189e6c4438a service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Received unexpected event network-vif-plugged-485172a5-5249-4ab9-bebc-5a25b5c83de8 for instance with vm_state building and task_state spawning. [ 1748.958909] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Successfully updated port: 485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1748.971990] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1748.972085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquired lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1748.972180] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1749.013483] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance cache missing network info.
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1749.100105] env[67820]: DEBUG nova.compute.manager [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Received event network-vif-plugged-c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1749.100347] env[67820]: DEBUG oslo_concurrency.lockutils [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] Acquiring lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1749.100564] env[67820]: DEBUG oslo_concurrency.lockutils [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1749.100739] env[67820]: DEBUG oslo_concurrency.lockutils [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1749.100912] env[67820]: DEBUG nova.compute.manager [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] No waiting events found dispatching network-vif-plugged-c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1749.101128] env[67820]: WARNING nova.compute.manager [req-ec03ae39-655d-4a2a-99d2-404e7f2f317d req-eb8985a0-f7ef-4d10-9d98-4b4771f1522d service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Received unexpected event network-vif-plugged-c767b5f0-ad32-439a-9fc8-7d7999f9b037 for instance with vm_state building and task_state spawning.
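The sequence above is nova-compute's external-event handshake: neutron delivers network-vif-plugged for ports 485172a5 and c767b5f0, the per-instance "-events" lock is taken, pop_instance_event finds no registered waiter ("No waiting events found dispatching"), and each event is logged as unexpected because both instances are still in vm_state building / task_state spawning, before the spawn path has registered anything to wait on. A rough, self-contained sketch of that pop-or-warn pattern, with illustrative names rather than nova's actual API:

import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()       # cf. the per-instance "-events" lock above
        self._waiters = defaultdict(dict)   # instance uuid -> {event name: Event}

    def prepare(self, instance_uuid, event_name):
        # Registered by the spawn path before it plugs the VIF and waits.
        ev = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = ev
        return ev

    def pop_event(self, instance_uuid, event_name):
        # Called on receipt of an external event; returns None if nothing waits.
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

events = InstanceEvents()
waiter = events.pop_event("c29a702f-67df-47d3-84ed-0cbd3b430c48",
                          "network-vif-plugged-c767b5f0")
if waiter is None:
    # Mirrors the WARNING above: the event arrived before anyone waited for it.
    print("Received unexpected event network-vif-plugged-c767b5f0")
else:
    waiter.set()

Since the instances are still spawning, the warnings are benign here; the "Successfully updated port" lines that follow show both builds proceeding.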
[ 1749.229819] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Successfully updated port: c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1749.241817] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1749.241970] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1749.242143] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1749.280837] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1749.410070] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Updating instance_info_cache with network_info: [{"id": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "address": "fa:16:3e:34:c4:08", "network": {"id": "4eda665a-2076-4195-905a-88a887ed43dd", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1330770507-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dae13838d30c46c0a67d0ed608b13558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap485172a5-52", "ovs_interfaceid": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1749.421322] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Releasing lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1749.421610] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance network_info: |[{"id": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "address": "fa:16:3e:34:c4:08", "network": {"id": "4eda665a-2076-4195-905a-88a887ed43dd", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1330770507-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dae13838d30c46c0a67d0ed608b13558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap485172a5-52", "ovs_interfaceid": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1749.421999] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:c4:08', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2a75bb6e-6331-4429-b1b9-c968cc22b9c9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '485172a5-5249-4ab9-bebc-5a25b5c83de8', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1749.429858] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Creating folder: Project (dae13838d30c46c0a67d0ed608b13558). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1749.430379] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd571b0f-d7e6-43f1-8b4a-7d72b9abec39 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.440751] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Created folder: Project (dae13838d30c46c0a67d0ed608b13558) in parent group-v692668. [ 1749.442042] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Creating folder: Instances. Parent ref: group-v692763. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1749.442042] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a1cbc418-11b9-4326-8b5e-7cb3094c0abf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.447549] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Updating instance_info_cache with network_info: [{"id": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "address": "fa:16:3e:56:cc:2d", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc767b5f0-ad", "ovs_interfaceid": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1749.451412] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Created folder: Instances in parent group-v692763. [ 1749.451629] env[67820]: DEBUG oslo.service.loopingcall [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1749.451801] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1749.451983] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ba073f0e-4f28-406d-ba69-0c2a79787b1a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.466689] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1749.466971] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance network_info: |[{"id": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "address": "fa:16:3e:56:cc:2d", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc767b5f0-ad", "ovs_interfaceid": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1749.467811] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:56:cc:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ba27300-88df-4c95-b9e0-a4a8b5039c3c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c767b5f0-ad32-439a-9fc8-7d7999f9b037', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1749.475025] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating folder: Project (2bbd98fc847f441ab80f175642f3d12a). Parent ref: group-v692668. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1749.476371] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35d50b2c-9f67-4630-a126-7fdb94224a22 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.477889] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1749.477889] env[67820]: value = "task-3467467" [ 1749.477889] env[67820]: _type = "Task" [ 1749.477889] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1749.486394] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467467, 'name': CreateVM_Task} progress is 5%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1749.487586] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created folder: Project (2bbd98fc847f441ab80f175642f3d12a) in parent group-v692668. [ 1749.487759] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating folder: Instances. Parent ref: group-v692765. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1749.488130] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-baab95e1-d3ec-4d10-86fa-16f645454389 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.497442] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created folder: Instances in parent group-v692765. [ 1749.497652] env[67820]: DEBUG oslo.service.loopingcall [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1749.497825] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1749.498180] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-95bea940-affa-4f37-9ca2-2217311e9492 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.516986] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1749.516986] env[67820]: value = "task-3467470" [ 1749.516986] env[67820]: _type = "Task" [ 1749.516986] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1749.523930] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467470, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1749.991016] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467467, 'name': CreateVM_Task, 'duration_secs': 0.325939} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1749.991016] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1749.991016] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1749.991016] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1749.991016] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1749.991016] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06bc0077-8ffa-4b44-8d6c-6396d7bc42d8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1749.995090] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for the task: (returnval){ [ 1749.995090] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5262a1dd-f832-7379-c4f8-8727dcd8986b" [ 1749.995090] env[67820]: _type = "Task" [ 1749.995090] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.003805] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5262a1dd-f832-7379-c4f8-8727dcd8986b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.025167] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467470, 'name': CreateVM_Task, 'duration_secs': 0.314629} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1750.025313] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1750.025930] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1750.505788] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1750.506099] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1750.506268] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1750.506484] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.506805] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1750.507061] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-adec1e67-b6b6-4846-a392-18c9fca058ac {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.511361] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 1750.511361] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5274dc1a-794b-f285-a14e-89b202c8b3c3" [ 1750.511361] env[67820]: _type = "Task" [ 1750.511361] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1750.519339] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5274dc1a-794b-f285-a14e-89b202c8b3c3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1750.569961] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.591318] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Getting list of instances from cluster (obj){ [ 1750.591318] env[67820]: value = "domain-c8" [ 1750.591318] env[67820]: _type = "ClusterComputeResource" [ 1750.591318] env[67820]: } {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1750.592552] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-823e26dc-8fcd-4712-b4a3-f37656c402e3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.609072] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Got total of 10 instances {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1750.609248] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.609433] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 11320faf-fa01-49c8-9d96-af9a4f6c5095 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.609591] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid e401a9ad-d6ed-4511-936c-4cf36d41281b {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.609744] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid bb25ada4-c7fe-47a4-b784-b66f50c8e9eb {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.609895] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 0cda1de0-73dd-45dd-932b-75e59fb785cf {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610053] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 99f872a5-2e7d-42b9-a94f-67153db8d0ad {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610204] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 276123c5-3edc-4e33-9b13-baae0fc9de9f {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610348] env[67820]: DEBUG nova.compute.manager 
[None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 1834f1ac-f85c-4176-b3c3-e85d50561b4a {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610491] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610636] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Triggering sync for uuid c29a702f-67df-47d3-84ed-0cbd3b430c48 {{(pid=67820) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1750.610961] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.611205] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.611404] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.611597] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.611788] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.611991] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.612209] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.612401] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" by
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.612587] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.612773] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.902388] env[67820]: DEBUG nova.compute.manager [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Received event network-changed-485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1750.902524] env[67820]: DEBUG nova.compute.manager [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Refreshing instance network info cache due to event network-changed-485172a5-5249-4ab9-bebc-5a25b5c83de8. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1750.902739] env[67820]: DEBUG oslo_concurrency.lockutils [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] Acquiring lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1750.902878] env[67820]: DEBUG oslo_concurrency.lockutils [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] Acquired lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1750.903610] env[67820]: DEBUG nova.network.neutron [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Refreshing network info cache for port 485172a5-5249-4ab9-bebc-5a25b5c83de8 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1751.022744] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1751.022967] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1751.023200]
env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1751.142569] env[67820]: DEBUG nova.compute.manager [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Received event network-changed-c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1751.142773] env[67820]: DEBUG nova.compute.manager [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Refreshing instance network info cache due to event network-changed-c767b5f0-ad32-439a-9fc8-7d7999f9b037. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1751.142984] env[67820]: DEBUG oslo_concurrency.lockutils [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] Acquiring lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1751.143141] env[67820]: DEBUG oslo_concurrency.lockutils [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] Acquired lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1751.143316] env[67820]: DEBUG nova.network.neutron [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Refreshing network info cache for port c767b5f0-ad32-439a-9fc8-7d7999f9b037 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1751.396759] env[67820]: DEBUG nova.network.neutron [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Updated VIF entry in instance network info cache for port c767b5f0-ad32-439a-9fc8-7d7999f9b037. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1751.397416] env[67820]: DEBUG nova.network.neutron [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Updating instance_info_cache with network_info: [{"id": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "address": "fa:16:3e:56:cc:2d", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc767b5f0-ad", "ovs_interfaceid": "c767b5f0-ad32-439a-9fc8-7d7999f9b037", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.406884] env[67820]: DEBUG oslo_concurrency.lockutils [req-5f3902f2-63fc-4634-b2a0-c17296ee1a42 req-b3bf43f7-e09c-498d-867a-7faf2955da9f service nova] Releasing lock "refresh_cache-c29a702f-67df-47d3-84ed-0cbd3b430c48" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1751.424962] env[67820]: DEBUG nova.network.neutron [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Updated VIF entry in instance network info cache for port 485172a5-5249-4ab9-bebc-5a25b5c83de8. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1751.425372] env[67820]: DEBUG nova.network.neutron [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Updating instance_info_cache with network_info: [{"id": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "address": "fa:16:3e:34:c4:08", "network": {"id": "4eda665a-2076-4195-905a-88a887ed43dd", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1330770507-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dae13838d30c46c0a67d0ed608b13558", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2a75bb6e-6331-4429-b1b9-c968cc22b9c9", "external-id": "nsx-vlan-transportzone-244", "segmentation_id": 244, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap485172a5-52", "ovs_interfaceid": "485172a5-5249-4ab9-bebc-5a25b5c83de8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1751.435739] env[67820]: DEBUG oslo_concurrency.lockutils [req-c5e1c48c-a78d-4543-9375-b028c190aff2 req-d21138dc-96cd-4922-a65b-5f4522f15ed4 service nova] Releasing lock "refresh_cache-1694799a-76d6-4e3e-83e1-5e2e4ad486d4" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1755.924570] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1755.924917] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1765.957085] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1784.953934] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring
lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1792.665465] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1793.113857] env[67820]: WARNING oslo_vmware.rw_handles [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1793.113857] env[67820]: ERROR oslo_vmware.rw_handles [ 1793.114446] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1793.116388] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1793.116631] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Copying Virtual Disk [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/c2082e07-fa45-4605-a848-9235f46ebc14/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1793.116909] env[67820]: DEBUG oslo_vmware.service [-] Invoking
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9e076c83-56a9-413f-bfea-f68b24e71b00 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.124088] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for the task: (returnval){ [ 1793.124088] env[67820]: value = "task-3467471" [ 1793.124088] env[67820]: _type = "Task" [ 1793.124088] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1793.131796] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Task: {'id': task-3467471, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1793.621639] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1793.621826] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1793.635049] env[67820]: DEBUG oslo_vmware.exceptions [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1793.635049] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1793.635049] env[67820]: ERROR nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1793.635049] env[67820]: Faults: ['InvalidArgument'] [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Traceback (most recent call last): [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] yield resources [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self.driver.spawn(context, instance, image_meta, [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self._fetch_image_if_missing(context, vi) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] image_cache(vi, tmp_image_ds_loc) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] vm_util.copy_virtual_disk( [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] session._wait_for_task(vmdk_copy_task) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return self.wait_for_task(task_ref) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return evt.wait() [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] result = hub.switch() [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return self.greenlet.switch() [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self.f(*self.args, **self.kw) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] raise exceptions.translate_fault(task_info.error) [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Faults: ['InvalidArgument'] [ 1793.635049] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] [ 1793.636194] env[67820]: INFO nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Terminating instance [ 1793.636793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1793.637060] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1793.637308] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8bbfb10a-2312-4c74-80b0-a33720134169 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.639466] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1793.639662] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1793.640383] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38cae090-4528-4a85-8881-098df42c2869 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.648447] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1793.648656] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9e9d77c-c64b-4985-8202-2141bb24afee {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.650738] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1793.650902] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1793.651924] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb8a3ac9-f203-4db6-8fe3-3650cf0c8e2d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.656708] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 1793.656708] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5228ae13-1541-5990-d02b-ea60ce7265cd" [ 1793.656708] env[67820]: _type = "Task" [ 1793.656708] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1793.664090] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5228ae13-1541-5990-d02b-ea60ce7265cd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1793.711115] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1793.711477] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1793.711569] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Deleting the datastore file [datastore1] 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1793.711777] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-74472af9-8eca-43a2-9cae-f8fc7cd0db6b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1793.718173] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for the task: (returnval){ [ 1793.718173] env[67820]: value = "task-3467473" [ 1793.718173] env[67820]: _type = "Task" [ 1793.718173] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1793.725820] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Task: {'id': task-3467473, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1794.166628] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1794.166850] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Creating directory with path [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1794.167144] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a24a871e-25da-4d13-8584-004f05ff3572 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.178151] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Created directory with path [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1794.178346] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Fetch image to [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1794.178668] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1794.179255] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f74dc83-9f08-4865-8015-5d2f863f7758 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.185884] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-521110ba-b8b3-4ab5-b08a-43d1506cc88a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.195487] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-066ae0cd-a048-4030-bf3f-7bf519b361c2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.228962] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18fd5c73-40d4-4518-9240-c375ee15ce7d {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.236215] env[67820]: DEBUG oslo_vmware.api [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Task: {'id': task-3467473, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07602} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1794.237690] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1794.238024] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1794.238176] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1794.238358] env[67820]: INFO nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1794.239971] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9b4bb5b8-f697-4c58-9c47-2e765a75492b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.241828] env[67820]: DEBUG nova.compute.claims [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1794.242037] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1794.242240] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1794.263991] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1794.317996] env[67820]: DEBUG oslo_vmware.rw_handles [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1794.377787] env[67820]: DEBUG oslo_vmware.rw_handles [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1794.377976] env[67820]: DEBUG oslo_vmware.rw_handles [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1794.511576] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa17d1d7-eb21-4fc1-b3fc-380af3a38976 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.518693] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa47afe7-a178-4df3-9174-2cebf5fc2a4e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.547618] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-534d8b80-6ab0-478d-81d4-4acc7cab4d9d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.554099] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fc8965f-82fd-48de-a62c-8c6c539f5a29 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1794.566756] env[67820]: DEBUG nova.compute.provider_tree [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1794.576470] env[67820]: DEBUG nova.scheduler.client.report [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1794.590026] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.348s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1794.590551] env[67820]: ERROR nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1794.590551] env[67820]: Faults: ['InvalidArgument'] [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Traceback (most recent call last): [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1794.590551] env[67820]: ERROR nova.compute.manager 
[instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self.driver.spawn(context, instance, image_meta, [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self._fetch_image_if_missing(context, vi) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] image_cache(vi, tmp_image_ds_loc) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] vm_util.copy_virtual_disk( [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] session._wait_for_task(vmdk_copy_task) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return self.wait_for_task(task_ref) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return evt.wait() [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] result = hub.switch() [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] return self.greenlet.switch() [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] self.f(*self.args, **self.kw) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] raise exceptions.translate_fault(task_info.error) [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Faults: ['InvalidArgument'] [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] [ 1794.591713] env[67820]: DEBUG nova.compute.utils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1794.592606] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Build of instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b was re-scheduled: A specified parameter was not correct: fileType [ 1794.592606] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1794.592958] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1794.593168] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1794.593353] env[67820]: DEBUG nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1794.593516] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1794.998928] env[67820]: DEBUG nova.network.neutron [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1795.011077] env[67820]: INFO nova.compute.manager [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Took 0.42 seconds to deallocate network for instance. [ 1795.109766] env[67820]: INFO nova.scheduler.client.report [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Deleted allocations for instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b [ 1795.131890] env[67820]: DEBUG oslo_concurrency.lockutils [None req-261adb6d-025b-4c6b-9d04-ea581488bd63 tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 614.963s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.133068] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 419.275s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.133296] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Acquiring lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.133505] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.133667] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.136104] env[67820]: INFO nova.compute.manager [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Terminating instance [ 1795.140482] env[67820]: DEBUG nova.compute.manager [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1795.140672] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1795.141070] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-207f6ddf-905a-49ad-ab65-14fef19f1d9d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.144816] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1795.151608] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf3eec0-6230-49fb-9db3-c9a72b2d03cf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.184994] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b could not be found. [ 1795.185226] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1795.185402] env[67820]: INFO nova.compute.manager [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Took 0.04 seconds to destroy the instance on the hypervisor. 
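The spawn failure above is easiest to triage once the interleaved ERROR lines are grouped per instance. Below is a minimal stdlib sketch for doing that; it assumes only the line shape seen in this log ("[ <ts>] env[<pid>]: ERROR <logger> [instance: <uuid>] <text>"), and the helper itself is hypothetical, not part of Nova or oslo.

    import re
    import sys
    from collections import defaultdict

    # Matches entries shaped like:
    #   [ 1794.590551] env[67820]: ERROR nova.compute.manager [instance: <uuid>] <text>
    ENTRY = re.compile(
        r"\[\s*(?P<ts>\d+\.\d+)\] env\[\d+\]: ERROR (?P<logger>\S+) "
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] ?(?P<text>.*)"
    )

    def errors_by_instance(lines):
        """Group ERROR lines by instance UUID, preserving log order."""
        grouped = defaultdict(list)
        for line in lines:
            m = ENTRY.match(line)
            if m:
                grouped[m.group("uuid")].append((m.group("ts"), m.group("text")))
        return grouped

    if __name__ == "__main__":
        with open(sys.argv[1]) as f:
            for uuid, entries in errors_by_instance(f).items():
                print(f"=== instance {uuid}: {len(entries)} ERROR lines ===")
                for ts, text in entries:
                    print(f"  [{ts}] {text}")

Run against this log, it would reproduce the fileType/InvalidArgument traceback above grouped under instance 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b.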
[ 1795.185639] env[67820]: DEBUG oslo.service.loopingcall [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1795.187864] env[67820]: DEBUG nova.compute.manager [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1795.187965] env[67820]: DEBUG nova.network.neutron [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1795.202775] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1795.203027] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.204445] env[67820]: INFO nova.compute.claims [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1795.213409] env[67820]: DEBUG nova.network.neutron [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1795.231625] env[67820]: INFO nova.compute.manager [-] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] Took 0.04 seconds to deallocate network for instance. [ 1795.323042] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b2c3cd0a-9e21-4b6e-b10d-bc45dfdb9beb tempest-ServerTagsTestJSON-1883678193 tempest-ServerTagsTestJSON-1883678193-project-member] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.323898] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 44.713s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1795.324153] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b] During sync_power_state the instance has a pending task (deleting). Skip. 
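The lockutils lines in this stream also double as timing data: the build lock on 8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b was held for 614.963s while the terminate request waited 419.275s behind it. A small sketch for summarizing those waits and holds across a log file follows; the regex assumes the 'Lock "..." acquired/"released" by "..." :: waited/held N.NNNs' wording shown above, and the script is illustrative only.

    import re
    import sys
    from collections import defaultdict

    # Matches oslo.concurrency lockutils timing fragments, e.g.
    #   Lock "compute_resources" acquired by "..." :: waited 0.000s
    #   Lock "compute_resources" "released" by "..." :: held 0.302s
    LOCK = re.compile(
        r'Lock "(?P<name>[^"]+)" (?:acquired|"released") by "[^"]+" '
        r':: (?P<kind>waited|held) (?P<secs>\d+\.\d+)s'
    )

    def lock_stats(lines):
        """Return {lock_name: {'waited': max_seconds, 'held': max_seconds}}."""
        stats = defaultdict(lambda: {"waited": 0.0, "held": 0.0})
        for line in lines:
            for m in LOCK.finditer(line):
                kind, secs = m.group("kind"), float(m.group("secs"))
                if secs > stats[m.group("name")][kind]:
                    stats[m.group("name")][kind] = secs
        return stats

    if __name__ == "__main__":
        with open(sys.argv[1]) as f:
            ranked = sorted(lock_stats(f).items(), key=lambda kv: -kv[1]["held"])
        for name, s in ranked:
            print(f'{name}: max waited {s["waited"]:.3f}s, max held {s["held"]:.3f}s')

Sorting by hold time puts long-lived build locks like the one above at the top, which is where a stuck image fetch or copy task usually shows up first.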
[ 1795.324342] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "8fd1dcdf-eaef-473f-8a57-c25bfcd35e1b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.424079] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a257a8ac-a6e5-4b4d-87bc-0f51793d5e97 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.431605] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4965d0-aece-45a8-9d64-6ba6bd4a1d7b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.460808] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a74a286-3f04-49cc-a832-50824bbd8917 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.467623] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d0b9051-42a5-4917-9be6-0b95c8c5ae5d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.481601] env[67820]: DEBUG nova.compute.provider_tree [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1795.490609] env[67820]: DEBUG nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1795.504993] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1795.505463] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1795.541911] env[67820]: DEBUG nova.compute.utils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1795.543982] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1795.543982] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1795.554274] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1795.615131] env[67820]: DEBUG nova.policy [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1c3af76995642bd8e9efed393e76655', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2bbd98fc847f441ab80f175642f3d12a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1795.619795] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1795.648316] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1795.648551] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1795.648709] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1795.648884] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1795.649041] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1795.649193] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1795.649400] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1795.649556] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1795.649719] env[67820]: DEBUG 
nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1795.649879] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1795.650061] env[67820]: DEBUG nova.virt.hardware [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1795.650999] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23429f77-81ee-4017-a2d3-7ded3ba1237e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.659314] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cad308b2-42c2-482e-a64f-c9aaa4fad3be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1795.940668] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Successfully created port: 46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1796.617091] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1796.647178] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Successfully updated port: 46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1796.662588] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1796.662793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1796.662874] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 
tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1796.723994] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1796.985679] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Updating instance_info_cache with network_info: [{"id": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "address": "fa:16:3e:f7:3e:ff", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46e935b7-31", "ovs_interfaceid": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1796.996956] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1796.997294] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance network_info: |[{"id": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "address": "fa:16:3e:f7:3e:ff", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46e935b7-31", "ovs_interfaceid": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1796.997672] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:3e:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ba27300-88df-4c95-b9e0-a4a8b5039c3c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '46e935b7-31a5-4169-9482-ce1c49d0ce26', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1797.005363] env[67820]: DEBUG oslo.service.loopingcall [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1797.005834] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1797.006134] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f0cdd104-df33-4948-aa42-ca113f57e0be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.027521] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1797.027521] env[67820]: value = "task-3467474" [ 1797.027521] env[67820]: _type = "Task" [ 1797.027521] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.034231] env[67820]: DEBUG nova.compute.manager [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Received event network-vif-plugged-46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1797.034585] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Acquiring lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1797.034674] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1797.034822] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1797.034958] env[67820]: DEBUG nova.compute.manager [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] No waiting events found dispatching network-vif-plugged-46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1797.035137] env[67820]: WARNING nova.compute.manager [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Received unexpected event network-vif-plugged-46e935b7-31a5-4169-9482-ce1c49d0ce26 for instance with vm_state building and task_state spawning. [ 1797.035297] env[67820]: DEBUG nova.compute.manager [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Received event network-changed-46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1797.035477] env[67820]: DEBUG nova.compute.manager [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Refreshing instance network info cache due to event network-changed-46e935b7-31a5-4169-9482-ce1c49d0ce26. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1797.035673] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Acquiring lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1797.035809] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Acquired lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1797.036047] env[67820]: DEBUG nova.network.neutron [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Refreshing network info cache for port 46e935b7-31a5-4169-9482-ce1c49d0ce26 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1797.040400] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467474, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.290603] env[67820]: DEBUG nova.network.neutron [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Updated VIF entry in instance network info cache for port 46e935b7-31a5-4169-9482-ce1c49d0ce26. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1797.290603] env[67820]: DEBUG nova.network.neutron [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Updating instance_info_cache with network_info: [{"id": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "address": "fa:16:3e:f7:3e:ff", "network": {"id": "764c9f14-f47a-4e5c-8563-6c7eb732d84b", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1064641058-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2bbd98fc847f441ab80f175642f3d12a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap46e935b7-31", "ovs_interfaceid": "46e935b7-31a5-4169-9482-ce1c49d0ce26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1797.301779] env[67820]: DEBUG oslo_concurrency.lockutils [req-e41520e1-a40f-4f9d-82fa-8b6f1441d484 req-88cc2984-1575-494f-8477-34f3fa32873f service nova] Releasing lock "refresh_cache-2965c630-07c6-4e08-a5ab-4996d4c72b82" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1797.538534] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467474, 'name': CreateVM_Task, 'duration_secs': 0.329147} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1797.538703] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1797.539365] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1797.539535] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1797.539846] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1797.540152] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fbde316-bdb9-4def-84ff-7ec63601b746 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1797.544523] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 1797.544523] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52c1b359-2b16-8b04-b27d-587adfd5ee6f" [ 1797.544523] env[67820]: _type = "Task" [ 1797.544523] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1797.552065] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52c1b359-2b16-8b04-b27d-587adfd5ee6f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1797.621472] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1797.621806] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.054708] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1798.055842] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1798.055842] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1798.621642] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.622012] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1798.622012] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1798.650067] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650219] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650355] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650499] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650625] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650746] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650863] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.650979] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.651114] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.651231] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1798.651496] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1798.869399] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1798.869722] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.621144] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.621412] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.634107] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.634457] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.634546] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1799.634757] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1799.635897] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1caf974c-226c-47e5-8e8e-54d6d64e8707 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.644712] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2897ad99-1ac0-4027-9096-5ac6fbc2bc92 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1799.658522] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e0bc2e9-2a52-4faf-907d-cd95c5b2cada {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.664545] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d495990f-e20e-470a-a986-08d36a4fc5ba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.694034] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1799.694211] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.694369] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.770244] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770387] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770439] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770705] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770705] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770845] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.770890] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.771009] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.771133] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.771247] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.784431] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 813af34e-49cc-40a9-a0d0-388a84fde493 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.795454] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.806188] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5c10efe6-fd80-430e-b647-3eaf2213af78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.816385] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.825888] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.826166] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1799.826319] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1799.999684] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ede56c8-7117-4433-b46e-75d1ad9a6dfc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.007584] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-182bac62-2369-4c15-a8af-8d106f9ae4a2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.036676] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-638e482c-7848-4a63-a718-5a94d23c41a0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.044162] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dee294e-29d1-408d-9924-e7fac4f0109c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1800.057248] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1800.067415] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1800.082561] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1800.082759] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.083876] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1809.012280] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1809.076323] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1832.914869] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1832.915184] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1843.289596] env[67820]: WARNING oslo_vmware.rw_handles [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1843.289596] env[67820]: ERROR oslo_vmware.rw_handles [ 1843.290251] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1843.292048] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1843.292302] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Copying Virtual Disk [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/01fe684a-ab35-47d7-8a1a-31391ac7ea88/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1843.292590] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7305a219-7aeb-41c6-b4c9-7c53b3765419 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.300132] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 1843.300132] env[67820]: value = "task-3467475" [ 1843.300132] env[67820]: _type = "Task" [ 1843.300132] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1843.307894] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': task-3467475, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1843.810816] env[67820]: DEBUG oslo_vmware.exceptions [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1843.811121] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1843.811684] env[67820]: ERROR nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.811684] env[67820]: Faults: ['InvalidArgument'] [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Traceback (most recent call last): [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] yield resources [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self.driver.spawn(context, instance, image_meta, [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self._fetch_image_if_missing(context, vi) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] image_cache(vi, tmp_image_ds_loc) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] vm_util.copy_virtual_disk( [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 
1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] session._wait_for_task(vmdk_copy_task) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return self.wait_for_task(task_ref) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return evt.wait() [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] result = hub.switch() [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return self.greenlet.switch() [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self.f(*self.args, **self.kw) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] raise exceptions.translate_fault(task_info.error) [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Faults: ['InvalidArgument'] [ 1843.811684] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] [ 1843.812844] env[67820]: INFO nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Terminating instance [ 1843.813602] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1843.813811] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating directory with path [datastore1] 
devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1843.814429] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb01e9a7-9544-4c00-b1a2-996a41f477cc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.816723] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1843.816906] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1843.817702] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06684ef2-ff12-4ddd-ba3f-9e8bba8f7463 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.824321] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1843.824546] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9e0758d-a9bc-46d4-8ff3-52cf317559f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.826964] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1843.827150] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1843.827795] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d52f779-0b57-4dbc-9f2f-c46c71906142 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.832330] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1843.832330] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52ad75c1-8e48-f262-5de4-5b8cdcb40c9a" [ 1843.832330] env[67820]: _type = "Task" [ 1843.832330] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1843.839631] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52ad75c1-8e48-f262-5de4-5b8cdcb40c9a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1843.893820] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1843.894073] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1843.894267] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Deleting the datastore file [datastore1] 11320faf-fa01-49c8-9d96-af9a4f6c5095 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1843.894554] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4dc8ed33-61fc-4fbb-ad14-85bedac6e9dc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1843.900463] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 1843.900463] env[67820]: value = "task-3467477" [ 1843.900463] env[67820]: _type = "Task" [ 1843.900463] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1843.909557] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': task-3467477, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1844.342951] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1844.343273] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating directory with path [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1844.343498] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ff79155e-206b-49d2-a04d-e2b99547cdd3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.355459] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Created directory with path [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1844.355655] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Fetch image to [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1844.355822] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1844.357077] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e875237-11df-4568-bcd1-42ad9e1f5c07 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.364622] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e90bf60-2a49-4661-b5da-abb0e3636492 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.376314] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8be088c8-bd3c-4c2d-a52b-be91a1571a9b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.409578] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1e3c7641-a4e7-4f03-982f-e267d418f6d2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.416402] env[67820]: DEBUG oslo_vmware.api [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': task-3467477, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088848} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1844.417809] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1844.417994] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1844.418182] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1844.418353] env[67820]: INFO nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Took 0.60 seconds to destroy the instance on the hypervisor. 
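
The DeleteDatastoreFile_Task exchange above (task issued at 1843.894554, polled at 1843.909557, reported "completed successfully" at 1844.416402) is oslo.vmware's standard invoke-then-wait pattern. A minimal sketch of that pattern under assumed inputs: the endpoint, credentials and the 'datacenter-2' moref value are placeholders, not values from this log.

    from oslo_vmware import api as vmware_api
    from oslo_vmware import vim_util

    # Placeholder vCenter endpoint and credentials; the session in this
    # log was created the same way against vc1.osci.c.eu-de-1.cloud.sap.
    session = vmware_api.VMwareAPISession(
        'vcenter.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # FileManager hangs off the vSphere service content; the datacenter
    # moref here is a made-up example value.
    file_manager = session.vim.service_content.fileManager
    dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')

    # DeleteDatastoreFile_Task returns a Task reference; wait_for_task()
    # polls its TaskInfo (the "progress is 0%" lines) until it reaches
    # 'success', or raises the translated fault on 'error'.
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 11320faf-fa01-49c8-9d96-af9a4f6c5095',
        datacenter=dc_ref)
    session.wait_for_task(task)

The same shape explains the InvalidArgument failure at 1843.811684: CopyVirtualDisk_Task was accepted by vCenter, and the fileType fault only surfaced when the poll loop read the task's error state and translated it into a VimFaultException.
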
[ 1844.420107] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87fdda4c-11f8-4484-94fc-7307a0ba30f1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.421914] env[67820]: DEBUG nova.compute.claims [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1844.422110] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1844.422322] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1844.442111] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1844.583088] env[67820]: DEBUG oslo_vmware.rw_handles [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1844.643707] env[67820]: DEBUG oslo_vmware.rw_handles [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1844.643894] env[67820]: DEBUG oslo_vmware.rw_handles [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1844.704986] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-019db2db-6428-4f11-8231-252cf37f2f53 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.713452] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd15a171-0046-401e-b08a-42287e0372a9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.748706] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e9d055-e563-4007-b7bc-882602f8a045 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.757654] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a215d49-2f37-4383-b2a7-0805a99f8092 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1844.771337] env[67820]: DEBUG nova.compute.provider_tree [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1844.780636] env[67820]: DEBUG nova.scheduler.client.report [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1844.795088] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.373s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1844.795660] env[67820]: ERROR nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.795660] env[67820]: Faults: ['InvalidArgument'] [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Traceback (most recent call last): [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 
11320faf-fa01-49c8-9d96-af9a4f6c5095] self.driver.spawn(context, instance, image_meta, [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self._fetch_image_if_missing(context, vi) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] image_cache(vi, tmp_image_ds_loc) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] vm_util.copy_virtual_disk( [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] session._wait_for_task(vmdk_copy_task) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return self.wait_for_task(task_ref) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return evt.wait() [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] result = hub.switch() [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] return self.greenlet.switch() [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] self.f(*self.args, **self.kw) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] raise exceptions.translate_fault(task_info.error) [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Faults: ['InvalidArgument'] [ 1844.795660] env[67820]: ERROR nova.compute.manager [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] [ 1844.796626] env[67820]: DEBUG nova.compute.utils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1844.798055] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Build of instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 was re-scheduled: A specified parameter was not correct: fileType [ 1844.798055] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1844.798443] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1844.798617] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1844.798782] env[67820]: DEBUG nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1844.798967] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1845.134457] env[67820]: DEBUG nova.network.neutron [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1845.145562] env[67820]: INFO nova.compute.manager [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Took 0.35 seconds to deallocate network for instance. [ 1845.234697] env[67820]: INFO nova.scheduler.client.report [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Deleted allocations for instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 [ 1845.255292] env[67820]: DEBUG oslo_concurrency.lockutils [None req-52cc7c97-a4e5-4eb1-bc3e-a35eb8abb6ae tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 603.215s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.256839] env[67820]: DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 406.788s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.256839] env[67820]: DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.256990] env[67820]: DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.257066] env[67820]: 
DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.259047] env[67820]: INFO nova.compute.manager [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Terminating instance [ 1845.261364] env[67820]: DEBUG nova.compute.manager [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1845.261537] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1845.262016] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1e042ded-ba65-4193-ac31-d35fb81888a3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.271288] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a71c4a66-e52c-411e-8346-f9e6fed7bc83 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.282447] env[67820]: DEBUG nova.compute.manager [None req-8ea24370-27bd-47d7-aeca-5473b5a8b7a9 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 813af34e-49cc-40a9-a0d0-388a84fde493] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1845.303721] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 11320faf-fa01-49c8-9d96-af9a4f6c5095 could not be found. [ 1845.303945] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1845.304145] env[67820]: INFO nova.compute.manager [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Took 0.04 seconds to destroy the instance on the hypervisor. 
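
The WARNING at 1845.303721 shows the delete path tolerating a VM that is already gone from vCenter: vmops logs InstanceNotFound and still reports "Instance destroyed", so the teardown that follows can deallocate networking and release the instance lock. A minimal sketch of that tolerant-destroy pattern; the destroy_on_hypervisor callable and its signature are hypothetical stand-ins, not nova's actual internals.

    from oslo_log import log as logging

    from nova import exception

    LOG = logging.getLogger(__name__)

    def destroy(instance_uuid, destroy_on_hypervisor):
        # destroy_on_hypervisor: hypothetical callable that unregisters
        # the VM and deletes its datastore files.
        try:
            destroy_on_hypervisor(instance_uuid)
        except exception.InstanceNotFound:
            # The backend VM is already gone; deletes must be idempotent,
            # so downgrade to a warning and fall through to network and
            # allocation cleanup.
            LOG.warning('Instance does not exist on backend: %s',
                        instance_uuid)
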
[ 1845.304403] env[67820]: DEBUG oslo.service.loopingcall [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1845.304637] env[67820]: DEBUG nova.compute.manager [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1845.304736] env[67820]: DEBUG nova.network.neutron [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1845.309325] env[67820]: DEBUG nova.compute.manager [None req-8ea24370-27bd-47d7-aeca-5473b5a8b7a9 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 813af34e-49cc-40a9-a0d0-388a84fde493] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1845.329522] env[67820]: DEBUG nova.network.neutron [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1845.332995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ea24370-27bd-47d7-aeca-5473b5a8b7a9 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "813af34e-49cc-40a9-a0d0-388a84fde493" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.452s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.337369] env[67820]: INFO nova.compute.manager [-] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] Took 0.03 seconds to deallocate network for instance. [ 1845.342719] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1845.396493] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.396770] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.398304] env[67820]: INFO nova.compute.claims [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1845.431373] env[67820]: DEBUG oslo_concurrency.lockutils [None req-41976f98-5713-4468-8aa9-77a87478034d tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.175s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.432203] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 94.821s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.432389] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 11320faf-fa01-49c8-9d96-af9a4f6c5095] During sync_power_state the instance has a pending task (deleting). Skip. 
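
The "Claim successful" line at 1845.398304 is the resource-tracker side of the inventory that placement keeps reporting as unchanged (the same dict appears at 1800.067415, 1844.780636 and 1845.668810 below). Placement's effective capacity per resource class is (total - reserved) * allocation_ratio, which is why this 48-vCPU node with a 4.0 CPU ratio absorbs another m1.nano claim (1 VCPU / 128 MB / 1 GB, per the flavor logged at 1845.816600) while only 10 vCPUs are allocated. A worked sketch: the inventory values are copied from the log, and the per-allocation min_unit/max_unit/step_size constraints are deliberately ignored.

    # Inventory as reported for provider 0f792661-ec04-4fc2-898f-e9860339eddd.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def capacity(inv):
        # Effective schedulable capacity per resource class:
        # (total - reserved) * allocation_ratio.
        return {rc: (rec['total'] - rec['reserved']) * rec['allocation_ratio']
                for rc, rec in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
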
[ 1845.432560] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "11320faf-fa01-49c8-9d96-af9a4f6c5095" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.603320] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae5b67f7-6296-45f2-b142-e8e22efd9f87 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.611087] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c949c2d3-e43d-4d13-b45f-110e5c778536 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.640610] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934b36c5-74aa-4d1d-85f8-e7724990abf2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.647671] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01b3b76-fb35-4941-80ee-4007a5bba114 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.660924] env[67820]: DEBUG nova.compute.provider_tree [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1845.668810] env[67820]: DEBUG nova.scheduler.client.report [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1845.682861] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.683352] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1845.718282] env[67820]: DEBUG nova.compute.utils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1845.719629] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1845.719804] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1845.727931] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1845.782470] env[67820]: DEBUG nova.policy [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5af508bde8847228f40888783142106', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ccf8fe5def284576a660bd7505892bde', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1845.792745] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1845.816600] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1845.816839] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1845.816993] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1845.817189] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1845.817357] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1845.817510] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1845.817713] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1845.817871] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1845.818046] 
env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1845.818213] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1845.818383] env[67820]: DEBUG nova.virt.hardware [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1845.819246] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25944083-393f-4b6b-a9ea-146048d31435 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.826840] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d899ffb0-1320-458b-a22d-3bc8a1bbf0db {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1846.189775] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Successfully created port: 085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1846.796703] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Successfully updated port: 085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1846.812483] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1846.812483] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1846.812483] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1846.869858] env[67820]: DEBUG 
nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1847.054867] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Updating instance_info_cache with network_info: [{"id": "085569b5-5737-4700-b302-cebf83120eb4", "address": "fa:16:3e:33:08:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap085569b5-57", "ovs_interfaceid": "085569b5-5737-4700-b302-cebf83120eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1847.067635] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Releasing lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1847.067906] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance network_info: |[{"id": "085569b5-5737-4700-b302-cebf83120eb4", "address": "fa:16:3e:33:08:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap085569b5-57", 
"ovs_interfaceid": "085569b5-5737-4700-b302-cebf83120eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1847.068314] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:08:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4aa1eda7-48b9-4fa2-af0b-94c718313af2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '085569b5-5737-4700-b302-cebf83120eb4', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1847.083291] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Creating folder: Project (ccf8fe5def284576a660bd7505892bde). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1847.083871] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-66133736-f86b-48b8-af85-ae7455647fa4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.095909] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Created folder: Project (ccf8fe5def284576a660bd7505892bde) in parent group-v692668. [ 1847.096108] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Creating folder: Instances. Parent ref: group-v692770. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1847.096366] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ab32351-3180-4dc1-b199-517f9e515278 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.104735] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Created folder: Instances in parent group-v692770. [ 1847.104952] env[67820]: DEBUG oslo.service.loopingcall [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1847.105145] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1847.105333] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-79786538-ed64-4266-973c-9a821e7a12f2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.123953] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1847.123953] env[67820]: value = "task-3467480" [ 1847.123953] env[67820]: _type = "Task" [ 1847.123953] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1847.131709] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467480, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1847.191033] env[67820]: DEBUG nova.compute.manager [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Received event network-vif-plugged-085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1847.191267] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Acquiring lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1847.191480] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1847.191651] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.191814] env[67820]: DEBUG nova.compute.manager [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] No waiting events found dispatching network-vif-plugged-085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1847.191978] env[67820]: WARNING nova.compute.manager [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Received unexpected event network-vif-plugged-085569b5-5737-4700-b302-cebf83120eb4 for instance with vm_state building and task_state spawning. 
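
[editor's note -- illustrative sketch, not log output] The CreateVM_Task entries above show oslo.vmware's wait/poll cycle: wait_for_task() hands the task to a looping call, and _poll_task() re-reads the task state on a fixed interval, logging "progress is N%" until the task finishes or fails. A minimal runnable approximation of that loop follows; FakeTask, polls_needed and poll_interval are invented here purely for demonstration and are not oslo.vmware's real internals.

    import time

    class FakeTask:
        """Stands in for a vCenter TaskInfo; succeeds after a few polls."""
        def __init__(self, polls_needed=3):
            self._polls = 0
            self._needed = polls_needed

        def poll(self):
            self._polls += 1
            progress = min(100, self._polls * 100 // self._needed)
            state = 'success' if self._polls >= self._needed else 'running'
            return state, progress

    def wait_for_task(task, poll_interval=0.5):
        # Mirrors the DEBUG lines above: poll, log progress, repeat until
        # the task leaves the running state.
        while True:
            state, progress = task.poll()
            print("Task progress is %d%%." % progress)
            if state == 'success':
                return state
            if state == 'error':
                # The real loop raises a fault translated from
                # TaskInfo.error (see the InvalidArgument traceback
                # later in this log); a plain exception stands in here.
                raise RuntimeError('task failed')
            time.sleep(poll_interval)

    wait_for_task(FakeTask())

The same pattern accounts for the SearchDatastore_Task and DeleteDatastoreFile_Task progress lines further on.
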
[ 1847.194490] env[67820]: DEBUG nova.compute.manager [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Received event network-changed-085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1847.194690] env[67820]: DEBUG nova.compute.manager [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Refreshing instance network info cache due to event network-changed-085569b5-5737-4700-b302-cebf83120eb4. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1847.194892] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Acquiring lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1847.195059] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Acquired lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1847.195232] env[67820]: DEBUG nova.network.neutron [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Refreshing network info cache for port 085569b5-5737-4700-b302-cebf83120eb4 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1847.450681] env[67820]: DEBUG nova.network.neutron [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Updated VIF entry in instance network info cache for port 085569b5-5737-4700-b302-cebf83120eb4. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1847.451058] env[67820]: DEBUG nova.network.neutron [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Updating instance_info_cache with network_info: [{"id": "085569b5-5737-4700-b302-cebf83120eb4", "address": "fa:16:3e:33:08:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap085569b5-57", "ovs_interfaceid": "085569b5-5737-4700-b302-cebf83120eb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1847.460654] env[67820]: DEBUG oslo_concurrency.lockutils [req-33b440d0-4fd8-421e-8575-7b591d4ac741 req-20f1180f-a4af-4474-a7a3-6f3426810091 service nova] Releasing lock "refresh_cache-bcb239dd-e793-43be-9f94-e53eb50e2f49" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1847.633868] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467480, 'name': CreateVM_Task, 'duration_secs': 0.351993} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1847.634035] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1847.640515] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1847.640683] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1847.640995] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1847.641243] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e5de26e4-bee0-4390-87ff-456c91c687fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1847.645593] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for the task: (returnval){ [ 1847.645593] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5221ec1a-c15b-8e00-7c49-a66f9a159a05" [ 1847.645593] env[67820]: _type = "Task" [ 1847.645593] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1847.654111] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5221ec1a-c15b-8e00-7c49-a66f9a159a05, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1848.155720] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1848.156021] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1848.156218] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1850.616654] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.621156] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.621411] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.621556] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1856.172236] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1858.616602] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1858.621333] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.621559] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.621589] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1860.621926] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1860.621926] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1860.644477] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.644645] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.644777] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.644901] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645034] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645160] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645279] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645396] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645513] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645628] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1860.645745] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1861.621606] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1861.621971] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1861.633622] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1861.633873] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.634055] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1861.634214] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1861.635355] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a6aaf6e-d1dd-422b-82f3-b948c1f5f1ac {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.644302] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bad4e994-7815-4d28-b81c-2edff114cdc3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.658214] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12d4bfab-84bb-4a3d-b590-329f357fc823 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.664393] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f11dcccc-c064-49be-930c-4546b9c5ce34 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.694409] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180952MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1861.694559] env[67820]: DEBUG 
oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1861.694754] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.766829] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e401a9ad-d6ed-4511-936c-4cf36d41281b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.766992] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767135] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767256] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767375] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767499] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767609] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767720] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767829] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.767938] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.779076] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 5c10efe6-fd80-430e-b647-3eaf2213af78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.790260] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.800266] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.811793] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.812059] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1861.812227] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1861.966673] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d11c12ef-34b5-43fc-9ee5-2bc3be99336b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.974196] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b83dc2-0174-4998-89b6-981d6c20a2a0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.003787] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a53256a4-b580-4e4e-90fe-8bbffa60c63b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.010902] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cef7804f-84d6-4f62-9788-156d1ab504e7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.025272] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1862.034020] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1862.051208] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1862.051383] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1864.051575] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1892.733443] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "1068e5cc-2514-4e07-aeee-e7e64c95a979" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1892.733721] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "1068e5cc-2514-4e07-aeee-e7e64c95a979" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.949637] env[67820]: WARNING oslo_vmware.rw_handles [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1893.949637] env[67820]: ERROR oslo_vmware.rw_handles [ 1893.950270] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1893.952235] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Caching image {{(pid=67820) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1893.952466] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Copying Virtual Disk [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/b1829490-9655-4a4c-a9d3-b02ce96e0030/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1893.952789] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9d560504-16fd-4fe8-a52e-df970b76141b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1893.961721] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1893.961721] env[67820]: value = "task-3467481" [ 1893.961721] env[67820]: _type = "Task" [ 1893.961721] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1893.969802] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467481, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1894.449442] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f6e8c092-e11d-4603-8251-380e9e97564b tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "75fc136a-9045-4b38-bb6c-37953cf8f778" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1894.449688] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f6e8c092-e11d-4603-8251-380e9e97564b tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "75fc136a-9045-4b38-bb6c-37953cf8f778" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1894.472239] env[67820]: DEBUG oslo_vmware.exceptions [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1894.472600] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1894.473230] env[67820]: ERROR nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1894.473230] env[67820]: Faults: ['InvalidArgument'] [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Traceback (most recent call last): [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] yield resources [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self.driver.spawn(context, instance, image_meta, [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self._fetch_image_if_missing(context, vi) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] image_cache(vi, tmp_image_ds_loc) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] vm_util.copy_virtual_disk( [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] session._wait_for_task(vmdk_copy_task) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return self.wait_for_task(task_ref) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return evt.wait() [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] result = hub.switch() [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return self.greenlet.switch() [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self.f(*self.args, **self.kw) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] raise exceptions.translate_fault(task_info.error) [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Faults: ['InvalidArgument'] [ 1894.473230] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] [ 1894.474333] env[67820]: INFO nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Terminating instance [ 1894.475147] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1894.475352] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1894.475596] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-e8ee2723-193f-4827-a1f1-57245e0b0be7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.477972] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1894.478183] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1894.478906] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ef32c8-2d3d-4fa4-a2eb-541732444185 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.485206] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1894.485439] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0ec7860a-8df1-4e42-8d37-94212b21b763 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.487542] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1894.487719] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1894.488651] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-847dc42b-8c22-4046-b5e9-597284a41527 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.493139] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1894.493139] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52159409-d039-9d6a-9afc-73a3841a1119" [ 1894.493139] env[67820]: _type = "Task" [ 1894.493139] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1894.500402] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52159409-d039-9d6a-9afc-73a3841a1119, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1894.550370] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1894.550579] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1894.550819] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleting the datastore file [datastore1] e401a9ad-d6ed-4511-936c-4cf36d41281b {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1894.551162] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8847e9e9-4c23-49f0-9808-31a2e6557961 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1894.557063] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for the task: (returnval){ [ 1894.557063] env[67820]: value = "task-3467483" [ 1894.557063] env[67820]: _type = "Task" [ 1894.557063] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1894.564607] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467483, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1895.003220] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1895.003504] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating directory with path [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1895.003710] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-268ae7db-ebf2-4499-a39b-54913e06a9aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.014947] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Created directory with path [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1895.015152] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Fetch image to [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1895.015320] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1895.016068] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7af7096-2cdf-4667-88f5-491cc4cb9412 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.022730] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c68499cf-5283-42bb-ab9c-c0b90863bd43 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.033267] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d69cce-5001-4715-87d8-8310c3f54209 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.067134] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a1a67422-24ec-45d6-8628-3e9fe17bbc46 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.074119] env[67820]: DEBUG oslo_vmware.api [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Task: {'id': task-3467483, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075292} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1895.075554] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1895.075744] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1895.075919] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1895.076120] env[67820]: INFO nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Took 0.60 seconds to destroy the instance on the hypervisor. 
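
The traceback and the DeleteDatastoreFile_Task entries above show oslo_vmware's task pattern: each vim call such as CopyVirtualDisk_Task or DeleteDatastoreFile_Task returns a Task moref, and wait_for_task polls its TaskInfo on a looping call until the state reaches 'success' or 'error', translating an 'error' state into the VimFaultException logged here. A minimal sketch of that polling loop; get_task_info is a hypothetical callable returning an object shaped like vim.TaskInfo, and this illustrates the pattern rather than oslo_vmware's actual implementation:

    import time

    class TaskFailed(Exception):
        """The task ended in the 'error' state (cf. translate_fault above)."""

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # Poll the vim.TaskInfo-shaped object until vCenter reports a
        # terminal state, as the looping call in the log above does.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # oslo_vmware raises a translated VimFaultException here,
                # e.g. "A specified parameter was not correct: fileType".
                raise TaskFailed(getattr(info.error, 'localizedMessage',
                                         info.error))
            time.sleep(poll_interval)  # the real loop yields via eventlet
        raise TimeoutError('task did not complete within %.0fs' % timeout)
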
[ 1895.077902] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a5a498ac-2935-4789-aa8c-ad8fa97e26d9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.079814] env[67820]: DEBUG nova.compute.claims [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1895.079993] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1895.080338] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1895.102144] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1895.154167] env[67820]: DEBUG oslo_vmware.rw_handles [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1895.213946] env[67820]: DEBUG oslo_vmware.rw_handles [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1895.213946] env[67820]: DEBUG oslo_vmware.rw_handles [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1895.340399] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-767a16c8-fc25-4cec-84bd-b59dc8b6df65 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.348303] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7649cc1b-835c-4124-a06c-be2aad7f2b61 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.379586] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4da1ff23-886c-497d-821b-e23fc8f730ff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.386688] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-106b3134-657d-42ee-870a-7419ded5234d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.399523] env[67820]: DEBUG nova.compute.provider_tree [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1895.408317] env[67820]: DEBUG nova.scheduler.client.report [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1895.421349] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.341s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1895.421851] env[67820]: ERROR nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.421851] env[67820]: Faults: ['InvalidArgument'] [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Traceback (most recent call last): [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1895.421851] 
env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self.driver.spawn(context, instance, image_meta, [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self._fetch_image_if_missing(context, vi) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] image_cache(vi, tmp_image_ds_loc) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] vm_util.copy_virtual_disk( [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] session._wait_for_task(vmdk_copy_task) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return self.wait_for_task(task_ref) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return evt.wait() [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] result = hub.switch() [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] return self.greenlet.switch() [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] self.f(*self.args, **self.kw) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] raise exceptions.translate_fault(task_info.error) [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Faults: ['InvalidArgument'] [ 1895.421851] env[67820]: ERROR nova.compute.manager [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] [ 1895.422832] env[67820]: DEBUG nova.compute.utils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1895.423846] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Build of instance e401a9ad-d6ed-4511-936c-4cf36d41281b was re-scheduled: A specified parameter was not correct: fileType [ 1895.423846] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1895.424239] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1895.424408] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1895.424577] env[67820]: DEBUG nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1895.424740] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1895.774563] env[67820]: DEBUG nova.network.neutron [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1895.787376] env[67820]: INFO nova.compute.manager [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Took 0.36 seconds to deallocate network for instance. [ 1895.880375] env[67820]: INFO nova.scheduler.client.report [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Deleted allocations for instance e401a9ad-d6ed-4511-936c-4cf36d41281b [ 1895.901942] env[67820]: DEBUG oslo_concurrency.lockutils [None req-8ba3f6f7-c78f-492b-9c4e-b3664055f59b tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.060s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1895.903234] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.293s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1895.903468] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Acquiring lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1895.903676] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1895.904705] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1895.906039] env[67820]: INFO nova.compute.manager [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Terminating instance [ 1895.908415] env[67820]: DEBUG nova.compute.manager [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1895.908611] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1895.909085] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cbf33e55-fb5f-4ed2-8530-8b8a2610a678 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.912533] env[67820]: DEBUG nova.compute.manager [None req-ddfd77fc-3d33-4864-bf88-1f6484518ca1 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 5c10efe6-fd80-430e-b647-3eaf2213af78] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1895.919309] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14aeffb0-58e9-4235-aa4f-0ea7c6583861 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1895.936257] env[67820]: DEBUG nova.compute.manager [None req-ddfd77fc-3d33-4864-bf88-1f6484518ca1 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 5c10efe6-fd80-430e-b647-3eaf2213af78] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1895.949553] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e401a9ad-d6ed-4511-936c-4cf36d41281b could not be found. 
[ 1895.949788] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1895.949974] env[67820]: INFO nova.compute.manager [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1895.950240] env[67820]: DEBUG oslo.service.loopingcall [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1895.952189] env[67820]: DEBUG nova.compute.manager [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1895.952292] env[67820]: DEBUG nova.network.neutron [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1895.962324] env[67820]: DEBUG oslo_concurrency.lockutils [None req-ddfd77fc-3d33-4864-bf88-1f6484518ca1 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "5c10efe6-fd80-430e-b647-3eaf2213af78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.771s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1895.970958] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1895.985936] env[67820]: DEBUG nova.network.neutron [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1896.008387] env[67820]: INFO nova.compute.manager [-] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] Took 0.06 seconds to deallocate network for instance. 
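
The "Waiting for function ... _deallocate_network_with_retries to return" entry above is oslo_service's looping-call machinery: the deallocation is wrapped in a function that is re-run on a fixed interval until it signals completion. A small sketch of that idiom using the real oslo_service API; try_deallocate is a hypothetical callable returning True once deallocation succeeds:

    from oslo_service import loopingcall

    def deallocate_with_retries(try_deallocate, interval=1):
        def _retry():
            if try_deallocate():
                # Stops the loop and unblocks start().wait() below.
                raise loopingcall.LoopingCallDone()

        timer = loopingcall.FixedIntervalLoopingCall(_retry)
        # start() returns an event; wait() blocks until LoopingCallDone,
        # which is the "Waiting for function ... to return" seen above.
        timer.start(interval=interval).wait()
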
[ 1896.031809] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1896.032078] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.033499] env[67820]: INFO nova.compute.claims [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1896.095817] env[67820]: DEBUG oslo_concurrency.lockutils [None req-99085e80-dd75-4905-b693-6291df97afd6 tempest-VolumesAdminNegativeTest-1846919607 tempest-VolumesAdminNegativeTest-1846919607-project-member] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.193s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.096715] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 145.485s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1896.096911] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e401a9ad-d6ed-4511-936c-4cf36d41281b] During sync_power_state the instance has a pending task (deleting). Skip. 
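
The claim above succeeds against the inventory figures this log keeps reporting unchanged for provider 0f792661-ec04-4fc2-898f-e9860339eddd. Placement's usual capacity rule is (total - reserved) * allocation_ratio per resource class, with max_unit capping any single allocation; applied to the exact figures in this log, that yields 192 schedulable VCPUs, 196078 MB of RAM, and 400 GB of disk. A quick check of that arithmetic:

    # Inventory exactly as reported in the log above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0,
                 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512,
                      'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0,
                    'allocation_ratio': 1.0, 'max_unit': 94},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        # max_unit bounds one allocation, not the provider total.
        print(f"{rc}: capacity={capacity:g}, max per allocation={inv['max_unit']}")
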
[ 1896.097093] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "e401a9ad-d6ed-4511-936c-4cf36d41281b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.246856] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9b4b4c1-ae66-4aea-b3b3-cd04d313978b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.254123] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f84bbf9-a3e1-4328-a21b-0162b6b9c757 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.283508] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e4ddeb6-d7b7-4fe1-a16b-5f046c3b8bf5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.290414] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e4a2f4-be7f-4e44-b41d-029759a7b0e9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.303227] env[67820]: DEBUG nova.compute.provider_tree [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1896.312501] env[67820]: DEBUG nova.scheduler.client.report [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1896.329031] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1896.329482] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1896.362446] env[67820]: DEBUG nova.compute.utils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1896.366044] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1896.366231] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1896.379023] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1896.439103] env[67820]: DEBUG nova.policy [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7848a5ecde643e1b65961cffbaa67c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4cd598f68a7142a68affb12d872648dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1896.448082] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1896.478438] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1896.478674] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1896.478829] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1896.479020] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1896.479167] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1896.479312] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1896.479514] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1896.479668] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1896.479837] env[67820]: DEBUG 
nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1896.479997] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1896.480193] env[67820]: DEBUG nova.virt.hardware [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1896.481053] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63c07ee9-1eb8-4337-89bd-b2bcd6a95a3f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.489643] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8723456-4ee3-4a2c-8397-41b7790cb543 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1896.730817] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Successfully created port: bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1897.379105] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Successfully updated port: bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1897.394929] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1897.395181] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquired lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1897.395259] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1897.436980] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb 
tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1897.811921] env[67820]: DEBUG nova.compute.manager [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Received event network-vif-plugged-bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1897.812169] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Acquiring lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1897.812378] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1897.812544] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1897.812710] env[67820]: DEBUG nova.compute.manager [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] No waiting events found dispatching network-vif-plugged-bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1897.812873] env[67820]: WARNING nova.compute.manager [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Received unexpected event network-vif-plugged-bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 for instance with vm_state building and task_state spawning. [ 1897.813041] env[67820]: DEBUG nova.compute.manager [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Received event network-changed-bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1897.813197] env[67820]: DEBUG nova.compute.manager [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Refreshing instance network info cache due to event network-changed-bfe7b06b-e668-4fd1-b6ad-9d24b8743f89. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1897.813359] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Acquiring lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1897.858064] env[67820]: DEBUG nova.network.neutron [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Updating instance_info_cache with network_info: [{"id": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "address": "fa:16:3e:9d:ef:ed", "network": {"id": "659d8ffb-6979-4901-b420-61cb12e735bc", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1464590530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4cd598f68a7142a68affb12d872648dd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfe7b06b-e6", "ovs_interfaceid": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1897.871052] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Releasing lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1897.871344] env[67820]: DEBUG nova.compute.manager [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Instance network_info: |[{"id": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "address": "fa:16:3e:9d:ef:ed", "network": {"id": "659d8ffb-6979-4901-b420-61cb12e735bc", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1464590530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4cd598f68a7142a68affb12d872648dd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfe7b06b-e6", "ovs_interfaceid": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1897.871649] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Acquired lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1897.871863] env[67820]: DEBUG nova.network.neutron [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Refreshing network info cache for port bfe7b06b-e668-4fd1-b6ad-9d24b8743f89 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1897.873029] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9d:ef:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cc30a16-f070-421c-964e-50c9aa32f17a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bfe7b06b-e668-4fd1-b6ad-9d24b8743f89', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1897.880493] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Creating folder: Project (4cd598f68a7142a68affb12d872648dd). Parent ref: group-v692668. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1897.881579] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-99a4539e-5594-433f-974e-3abd898af11e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.894263] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Created folder: Project (4cd598f68a7142a68affb12d872648dd) in parent group-v692668. [ 1897.894457] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Creating folder: Instances. Parent ref: group-v692773. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1897.894728] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4797d353-4cd0-48f7-8add-013a8b287217 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.904133] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Created folder: Instances in parent group-v692773. 
[ 1897.904355] env[67820]: DEBUG oslo.service.loopingcall [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1897.904536] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1897.904758] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-63431cf4-8b8b-4614-820e-1ae853a93832 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1897.924904] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1897.924904] env[67820]: value = "task-3467486" [ 1897.924904] env[67820]: _type = "Task" [ 1897.924904] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1897.933981] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467486, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1898.166815] env[67820]: DEBUG nova.network.neutron [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Updated VIF entry in instance network info cache for port bfe7b06b-e668-4fd1-b6ad-9d24b8743f89. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1898.167282] env[67820]: DEBUG nova.network.neutron [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Updating instance_info_cache with network_info: [{"id": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "address": "fa:16:3e:9d:ef:ed", "network": {"id": "659d8ffb-6979-4901-b420-61cb12e735bc", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1464590530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4cd598f68a7142a68affb12d872648dd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfe7b06b-e6", "ovs_interfaceid": "bfe7b06b-e668-4fd1-b6ad-9d24b8743f89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1898.177854] env[67820]: DEBUG oslo_concurrency.lockutils [req-6a5ab0c6-ed17-4a04-8c09-561faa22c052 req-7f273e79-9ae0-4bfb-a649-702c645d2f0a service nova] Releasing lock "refresh_cache-fffda39c-1960-49f9-a26b-6b87e2c3c53e" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1898.435144] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467486, 'name': CreateVM_Task, 'duration_secs': 0.272633} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1898.435446] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1898.436060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1898.436238] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1898.436553] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1898.437148] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ac4bcfc-5f6a-42e6-9453-f76434df68c1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1898.441524] env[67820]: DEBUG oslo_vmware.api [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Waiting for the task: (returnval){ [ 1898.441524] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52247fc8-773b-9619-0809-752d98b2e10e" [ 1898.441524] env[67820]: _type = "Task" [ 1898.441524] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1898.449105] env[67820]: DEBUG oslo_vmware.api [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52247fc8-773b-9619-0809-752d98b2e10e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1898.953789] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1898.954067] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1898.954287] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1914.622737] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1915.621571] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1915.621749] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1919.621771] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1920.616556] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1920.622730] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1920.622730] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1920.622730] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1920.649389] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.649563] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.649696] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.649824] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.649949] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650081] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650215] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650335] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650454] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650572] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1920.650690] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1921.621436] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.621693] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1921.632640] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.632941] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1921.633029] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1921.633187] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1921.634303] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53c20289-14d9-4f64-b535-b2514fe220d9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.643124] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e18e65b-87f4-4d5d-b359-2170b49e097a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.656919] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4979cf3e-6e62-4e93-a2f3-f7fca454d9fe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.663162] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38fef436-7a10-495b-9c5f-e541d9ded89c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.691592] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1921.691741] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.691932] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1921.757970] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758202] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758368] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758540] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758694] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758843] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.758991] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.759173] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.759320] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.759460] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1921.769845] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1921.779300] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1921.788268] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1921.798414] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 75fc136a-9045-4b38-bb6c-37953cf8f778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1921.798615] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1921.798800] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1921.959521] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40cc1588-6b4b-4a98-88ad-2936e8086151 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.966776] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af883875-07ec-450f-8b07-4b02f340c23e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.996287] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd8722d-e0c7-4407-b9ac-e2c128a0141d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.002848] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2483453-efaf-46c7-8f1c-5e1da3dabc23 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.015447] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 
0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1922.024504] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1922.037247] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1922.037424] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1923.037782] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.622567] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1943.948368] env[67820]: WARNING oslo_vmware.rw_handles [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1943.948368] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1943.948368] env[67820]: ERROR 
oslo_vmware.rw_handles [ 1943.949218] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1943.951098] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1943.951375] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Copying Virtual Disk [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/4e918085-eadc-47d4-845e-a9e849ff7933/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1943.951719] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-faf3efa1-68c7-46b4-9ca9-d0b52796d560 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1943.959607] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1943.959607] env[67820]: value = "task-3467487" [ 1943.959607] env[67820]: _type = "Task" [ 1943.959607] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1943.967385] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467487, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1944.472096] env[67820]: DEBUG oslo_vmware.exceptions [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1944.472399] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1944.472938] env[67820]: ERROR nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1944.472938] env[67820]: Faults: ['InvalidArgument'] [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Traceback (most recent call last): [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] yield resources [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self.driver.spawn(context, instance, image_meta, [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self._fetch_image_if_missing(context, vi) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] image_cache(vi, tmp_image_ds_loc) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] vm_util.copy_virtual_disk( [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] session._wait_for_task(vmdk_copy_task) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return self.wait_for_task(task_ref) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return evt.wait() [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] result = hub.switch() [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return self.greenlet.switch() [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self.f(*self.args, **self.kw) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] raise exceptions.translate_fault(task_info.error) [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Faults: ['InvalidArgument'] [ 1944.472938] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] [ 1944.474100] env[67820]: INFO nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Terminating instance [ 1944.474812] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1944.475037] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1944.475281] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-000aaa72-15ad-446f-8369-0b9d8c640043 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.477768] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1944.477976] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1944.478727] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6cf05cd-08f7-4ff1-8674-07786829cce6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.482657] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1944.482825] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1944.485137] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2848f116-98dc-4169-82e0-85c341c73911 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.487154] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1944.487363] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f9b48c1c-c90e-4a1e-9b4c-048714e5af17 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.491159] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for the task: (returnval){ [ 1944.491159] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52c73bd6-4933-f071-0cd3-d02eb60cbab5" [ 1944.491159] env[67820]: _type = "Task" [ 1944.491159] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1944.503826] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52c73bd6-4933-f071-0cd3-d02eb60cbab5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1944.558209] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1944.558414] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1944.558597] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleting the datastore file [datastore1] bb25ada4-c7fe-47a4-b784-b66f50c8e9eb {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1944.558873] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-406529db-d336-4fc0-810f-9626c48bbda6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.564550] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 1944.564550] env[67820]: value = "task-3467489" [ 1944.564550] env[67820]: _type = "Task" [ 1944.564550] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1944.571685] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467489, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.001934] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1945.002348] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Creating directory with path [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1945.002441] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df8383cf-9daf-4eb5-afc0-cd8d0c8d2a57 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.013752] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Created directory with path [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1945.013931] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Fetch image to [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1945.014118] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1945.014859] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3458c8-01b1-4d25-ad58-d269ef631f50 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.021507] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a94e97-3dd5-4b60-bc62-f20c72b5aed8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.030598] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ca676c-8012-45d4-9b38-cca49826a0f3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.061688] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3d5a7aab-e2ef-4848-bcd0-92e474f73259 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.071931] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5637788f-dd7d-4d65-a24a-6c99e090ec84 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.076715] env[67820]: DEBUG oslo_vmware.api [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': task-3467489, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077476} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1945.077382] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1945.077574] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1945.077746] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1945.077924] env[67820]: INFO nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Took 0.60 seconds to destroy the instance on the hypervisor. 
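The teardown just logged for bb25ada4-c7fe-47a4-b784-b66f50c8e9eb is the driver's usual two-step destroy: a synchronous VirtualMachine.UnregisterVM removes the VM from the vCenter inventory, then FileManager.DeleteDatastoreFile_Task removes its files from datastore1, polled via wait_for_task until the task record reports completion (duration_secs: 0.077476 above). A minimal sketch of the same sequence through the public oslo.vmware API follows; the `session` object is assumed to exist (see the earlier folder sketch), and the vm and datacenter morefs are hypothetical placeholders since the log elides them.

    # Sketch of unregister-then-delete, assuming `session` from the earlier
    # sketch; the morefs marked below are placeholders, not log values.
    from oslo_vmware import exceptions as vexc
    from oslo_vmware import vim_util

    vm = vim_util.get_moref('vm-12345', 'VirtualMachine')    # placeholder
    dc = vim_util.get_moref('datacenter-3', 'Datacenter')    # placeholder

    # UnregisterVM is synchronous: the VM leaves the inventory but its
    # files stay on the datastore, hence the explicit deletion afterwards.
    session.invoke_api(session.vim, 'UnregisterVM', vm)

    # DeleteDatastoreFile_Task is asynchronous; wait_for_task() polls
    # TaskInfo (the "progress is 0%" records) and raises a translated
    # VimFaultException on failure, as seen above for CopyVirtualDisk_Task.
    file_manager = session.vim.service_content.fileManager
    try:
        task = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task', file_manager,
            name='[datastore1] bb25ada4-c7fe-47a4-b784-b66f50c8e9eb',
            datacenter=dc)
        session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        # exc.fault_list is what nova renders as "Faults: ['InvalidArgument']".
        print('datastore file delete failed:', exc.fault_list)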
[ 1945.080197] env[67820]: DEBUG nova.compute.claims [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1945.080384] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1945.080621] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1945.099545] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1945.171491] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1945.234177] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1945.234405] env[67820]: DEBUG oslo_vmware.rw_handles [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1945.384969] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4445859e-726b-4bdb-a710-10b111248200 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.393177] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71457613-7711-466e-b0b9-17b581cac0ad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.422952] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1107e0ec-d2cc-4df7-b930-f5afa436a8a6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.431294] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c059df8b-cfb6-44fa-83b4-deef304438f8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.444826] env[67820]: DEBUG nova.compute.provider_tree [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1945.454713] env[67820]: DEBUG nova.scheduler.client.report [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1945.469458] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.389s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1945.470286] env[67820]: ERROR nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.470286] env[67820]: Faults: ['InvalidArgument'] [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Traceback (most recent call last): [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1945.470286] 
env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self.driver.spawn(context, instance, image_meta, [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self._fetch_image_if_missing(context, vi) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] image_cache(vi, tmp_image_ds_loc) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] vm_util.copy_virtual_disk( [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] session._wait_for_task(vmdk_copy_task) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return self.wait_for_task(task_ref) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return evt.wait() [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] result = hub.switch() [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] return self.greenlet.switch() [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] self.f(*self.args, **self.kw) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] raise exceptions.translate_fault(task_info.error) [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Faults: ['InvalidArgument'] [ 1945.470286] env[67820]: ERROR nova.compute.manager [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] [ 1945.471314] env[67820]: DEBUG nova.compute.utils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1945.472999] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Build of instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb was re-scheduled: A specified parameter was not correct: fileType [ 1945.472999] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1945.473392] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1945.473564] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1945.473731] env[67820]: DEBUG nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1945.473893] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1945.855158] env[67820]: DEBUG nova.network.neutron [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1945.865594] env[67820]: INFO nova.compute.manager [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Took 0.39 seconds to deallocate network for instance. [ 1945.971143] env[67820]: INFO nova.scheduler.client.report [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Deleted allocations for instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb [ 1945.993529] env[67820]: DEBUG oslo_concurrency.lockutils [None req-57c228f4-04d4-4472-a81e-74e15eb97c52 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 642.248s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1945.994825] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 445.531s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1945.995058] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1945.995285] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1945.995454] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1945.997563] env[67820]: INFO nova.compute.manager [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Terminating instance [ 1945.999537] env[67820]: DEBUG nova.compute.manager [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1945.999537] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1945.999925] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0afd8ac1-3053-4a0f-bc24-a29c73284893 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.009729] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa6b6cd1-1cb0-418d-b067-26ae0f07d088 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.021247] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1946.043602] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bb25ada4-c7fe-47a4-b784-b66f50c8e9eb could not be found. 
[ 1946.043815] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1946.044010] env[67820]: INFO nova.compute.manager [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1946.044278] env[67820]: DEBUG oslo.service.loopingcall [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1946.044671] env[67820]: DEBUG nova.compute.manager [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1946.044671] env[67820]: DEBUG nova.network.neutron [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1946.069293] env[67820]: DEBUG nova.network.neutron [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1946.071873] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.072132] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.073656] env[67820]: INFO nova.compute.claims [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1946.078423] env[67820]: INFO nova.compute.manager [-] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] Took 0.03 seconds to deallocate network for instance. 
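Nearly every step in this sequence is bracketed by oslo_concurrency.lockutils records reporting how long a caller waited for a named lock and how long it held it (above, do_terminate_instance waited 445.531s for the per-instance lock but held it only 0.184s, because the preceding build held that lock for the full 642.248s). Below is a minimal sketch of that instrumentation pattern using only the standard library; timed_lock and its lock registry are illustrative, not oslo.concurrency's API.

import threading
import time
from contextlib import contextmanager

_locks = {}          # name -> threading.Lock, created on first use
_registry_lock = threading.Lock()


def _get_lock(name):
    with _registry_lock:
        return _locks.setdefault(name, threading.Lock())


@contextmanager
def timed_lock(name, caller):
    """Acquire a named lock, logging waited/held durations (illustrative only)."""
    lock = _get_lock(name)
    start = time.monotonic()
    lock.acquire()
    acquired = time.monotonic()
    print(f'Lock "{name}" acquired by "{caller}" :: waited {acquired - start:.3f}s')
    try:
        yield
    finally:
        lock.release()
        # Mirrors the log's quoting style: Lock "..." "released" by "..."
        print(f'Lock "{name}" "released" by "{caller}" :: held '
              f'{time.monotonic() - acquired:.3f}s')


if __name__ == "__main__":
    with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
        time.sleep(0.05)  # stand-in for claim bookkeeping done under the lock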
[ 1946.178925] env[67820]: DEBUG oslo_concurrency.lockutils [None req-47d4615d-b85a-4b1b-a739-512e1ac0f2d9 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.184s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.180022] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 195.568s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1946.180140] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bb25ada4-c7fe-47a4-b784-b66f50c8e9eb] During sync_power_state the instance has a pending task (deleting). Skip. [ 1946.180251] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "bb25ada4-c7fe-47a4-b784-b66f50c8e9eb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.316846] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc017f96-79db-4c68-bc96-34860579b0be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.325458] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d61cb70c-61d0-4070-8dd6-a8e3799381c3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.358672] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f91bd3c-5dde-4fb3-b335-67c3190e1415 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.366320] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4835ebfb-dcf6-4335-8b14-5ae2644d459d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.381680] env[67820]: DEBUG nova.compute.provider_tree [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1946.391798] env[67820]: DEBUG nova.scheduler.client.report [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1946.408039] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.408039] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1946.444412] env[67820]: DEBUG nova.compute.utils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1946.446127] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1946.446311] env[67820]: DEBUG nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1946.456421] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1946.510983] env[67820]: DEBUG nova.policy [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df43615850404e60b571c2ab5296519c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e17152dd1ce04f3dbcb729e8315f0006', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1946.550526] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1946.577140] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1946.577458] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1946.577642] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1946.577836] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1946.578174] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1946.578390] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1946.578613] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1946.578776] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1946.578943] 
env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1946.579126] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1946.579295] env[67820]: DEBUG nova.virt.hardware [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1946.580162] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27d6430e-ad73-4b51-a2f3-922ae0dcbd3a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.588565] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7494abdc-2dab-4218-93a8-cfe1c3e097a2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1946.877867] env[67820]: DEBUG nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Successfully created port: 3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1947.481948] env[67820]: DEBUG nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Successfully updated port: 3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1947.496332] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1947.497405] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1947.497699] env[67820]: DEBUG nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1947.542998] env[67820]: DEBUG 
nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1947.710753] env[67820]: DEBUG nova.network.neutron [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Updating instance_info_cache with network_info: [{"id": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "address": "fa:16:3e:f1:9d:a3", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3d1cd454-87", "ovs_interfaceid": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1947.723802] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1947.724197] env[67820]: DEBUG nova.compute.manager [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Instance network_info: |[{"id": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "address": "fa:16:3e:f1:9d:a3", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3d1cd454-87", 
"ovs_interfaceid": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1947.724586] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f1:9d:a3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '418ddd3d-5f64-407e-8e0c-c8b81639bee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3d1cd454-87f7-4c9f-9d42-86aa46b2942c', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1947.732336] env[67820]: DEBUG oslo.service.loopingcall [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1947.732780] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1947.733072] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-32949936-5878-4968-99c0-434cd20bdff7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1947.754051] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1947.754051] env[67820]: value = "task-3467490" [ 1947.754051] env[67820]: _type = "Task" [ 1947.754051] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1947.762558] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467490, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1947.941110] env[67820]: DEBUG nova.compute.manager [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Received event network-vif-plugged-3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1947.941349] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Acquiring lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1947.941566] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1947.941735] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1947.941899] env[67820]: DEBUG nova.compute.manager [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] No waiting events found dispatching network-vif-plugged-3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1947.942075] env[67820]: WARNING nova.compute.manager [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Received unexpected event network-vif-plugged-3d1cd454-87f7-4c9f-9d42-86aa46b2942c for instance with vm_state building and task_state spawning. [ 1947.942237] env[67820]: DEBUG nova.compute.manager [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Received event network-changed-3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1947.942389] env[67820]: DEBUG nova.compute.manager [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Refreshing instance network info cache due to event network-changed-3d1cd454-87f7-4c9f-9d42-86aa46b2942c. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1947.942569] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Acquiring lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1947.942702] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Acquired lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1947.942856] env[67820]: DEBUG nova.network.neutron [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Refreshing network info cache for port 3d1cd454-87f7-4c9f-9d42-86aa46b2942c {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1948.263939] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467490, 'name': CreateVM_Task, 'duration_secs': 0.290896} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1948.266264] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1948.266869] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1948.267110] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1948.267536] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1948.268042] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9a45b903-7561-4f91-9e3c-5aa046bfa21f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1948.272386] env[67820]: DEBUG oslo_vmware.api [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 1948.272386] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52745658-017a-5e9b-a885-90b4dab9782b" [ 1948.272386] env[67820]: _type = "Task" [ 1948.272386] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1948.282534] env[67820]: DEBUG oslo_vmware.api [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52745658-017a-5e9b-a885-90b4dab9782b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1948.351877] env[67820]: DEBUG nova.network.neutron [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Updated VIF entry in instance network info cache for port 3d1cd454-87f7-4c9f-9d42-86aa46b2942c. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1948.352262] env[67820]: DEBUG nova.network.neutron [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Updating instance_info_cache with network_info: [{"id": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "address": "fa:16:3e:f1:9d:a3", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3d1cd454-87", "ovs_interfaceid": "3d1cd454-87f7-4c9f-9d42-86aa46b2942c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1948.361918] env[67820]: DEBUG oslo_concurrency.lockutils [req-794e7df0-c0d4-4ba3-8148-e7041dec654a req-3fcbd635-f516-4582-836f-30ee77e6c554 service nova] Releasing lock "refresh_cache-eb759eb8-e670-4b9b-a0e0-865bdd53a208" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1948.782876] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1948.783214] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1948.783317] env[67820]: DEBUG oslo_concurrency.lockutils [None req-bc5ae1b1-fa91-4d7d-bc4a-49fb0a8f25e1 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1952.053889] env[67820]: DEBUG oslo_concurrency.lockutils [None req-80056740-5b62-4734-8c8d-bfab5788f7e2 tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquiring lock "fffda39c-1960-49f9-a26b-6b87e2c3c53e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1963.761143] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquiring lock "eda6fcc3-b964-4728-a2e2-ece044b0ffa2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1963.761431] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Lock "eda6fcc3-b964-4728-a2e2-ece044b0ffa2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1971.617055] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.621968] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.622268] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.622397] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1981.621597] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1981.621882] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1981.621926] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1981.646379] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.646829] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.646829] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.646829] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.646949] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.646988] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.647098] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.647231] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.647404] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.647467] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1981.647572] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1981.648082] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.621492] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.621831] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1982.633235] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1982.633452] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1982.633624] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1982.633779] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1982.634917] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-597c5409-8084-4222-92ae-ee6de1bf6d6a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.644334] env[67820]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9970330e-3405-460a-8d9f-90f555628c1a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.661772] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6ad32d5-db64-4e19-bc2f-4fbc03d18e44 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.668193] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d08678c-6df0-4b55-aedc-ffe2c28eb530 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1982.697849] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180896MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1982.698054] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1982.698225] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1982.774073] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774231] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774370] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774510] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774871] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774871] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.774871] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.775108] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.775108] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.775252] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1982.787465] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1982.798063] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1982.809166] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 75fc136a-9045-4b38-bb6c-37953cf8f778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1982.819174] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1982.819450] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1982.819606] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1983.011069] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4547f0b5-b316-4a14-8110-df0fdbf9c511 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.018504] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc7af60e-e3f8-4686-bbad-b63de68e5bd8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.048907] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83c048fc-b783-4ac6-924b-e6ae64abe17e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.056073] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83bf2e49-53e7-4bce-82d4-6a2ae025ed3d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1983.069620] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1983.080848] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1983.095902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1983.095902] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.397s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1984.095557] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1984.622339] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1987.622409] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1991.029777] env[67820]: WARNING oslo_vmware.rw_handles [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1991.029777] env[67820]: ERROR oslo_vmware.rw_handles [ 1991.030716] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 
tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1991.032037] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1991.032323] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Copying Virtual Disk [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/831e1bf8-6ba2-43c7-94f1-7dd118bff3d5/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1991.032601] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e85b16ce-71bd-4df9-ae6c-e06795895130 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.040901] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for the task: (returnval){ [ 1991.040901] env[67820]: value = "task-3467491" [ 1991.040901] env[67820]: _type = "Task" [ 1991.040901] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1991.048659] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Task: {'id': task-3467491, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1991.553099] env[67820]: DEBUG oslo_vmware.exceptions [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1991.553099] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1991.553517] env[67820]: ERROR nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1991.553517] env[67820]: Faults: ['InvalidArgument'] [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Traceback (most recent call last): [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] yield resources [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self.driver.spawn(context, instance, image_meta, [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self._fetch_image_if_missing(context, vi) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] image_cache(vi, tmp_image_ds_loc) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] vm_util.copy_virtual_disk( [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] session._wait_for_task(vmdk_copy_task) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return self.wait_for_task(task_ref) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return evt.wait() [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] result = hub.switch() [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return self.greenlet.switch() [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self.f(*self.args, **self.kw) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] raise exceptions.translate_fault(task_info.error) [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Faults: ['InvalidArgument'] [ 1991.553517] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] [ 1991.554679] env[67820]: INFO nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Terminating instance [ 1991.555382] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1991.555589] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1991.555823] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-473906c5-254b-422f-a706-e1e70116cda0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.557932] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1991.558132] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1991.558841] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233120fb-d471-4a4b-9b0b-1278dc82bcb8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.565399] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1991.565605] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c45ceeb9-7581-41b7-a9b7-42e1bba5065d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.567626] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1991.567799] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1991.568730] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6db4bda2-9ba4-4bfb-8951-919d3c1ce50e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.573540] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 1991.573540] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]528c84fe-2248-7802-18e4-e4e131d285de" [ 1991.573540] env[67820]: _type = "Task" [ 1991.573540] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1991.582600] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]528c84fe-2248-7802-18e4-e4e131d285de, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1991.637290] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1991.637559] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1991.637727] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Deleting the datastore file [datastore1] 0cda1de0-73dd-45dd-932b-75e59fb785cf {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1991.637996] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-37163c04-dd35-4a2d-bfc3-0a0dc49c6b76 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.644746] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for the task: (returnval){ [ 1991.644746] env[67820]: value = "task-3467493" [ 1991.644746] env[67820]: _type = "Task" [ 1991.644746] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1991.651982] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Task: {'id': task-3467493, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1992.083443] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1992.083785] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Creating directory with path [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1992.083901] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a5b3d70-e5ec-4c7d-b567-a27eb9ecd84d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.096676] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Created directory with path [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1992.096856] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Fetch image to [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1992.097029] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1992.097769] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12f100da-e72b-41bc-8393-d4aec6eb4c7f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.104033] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d071d42-d212-4e21-8a6a-bfff62725a18 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.112611] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec77d02-c4c5-4893-9328-4069baf5b48f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.142102] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-212f0617-8058-4ade-964f-76410378d450 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.150601] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8aeb8b13-bd5b-451b-b64a-670d03efd5af {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.154650] env[67820]: DEBUG oslo_vmware.api [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Task: {'id': task-3467493, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087027} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1992.155141] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1992.155351] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1992.155526] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1992.155698] env[67820]: INFO nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Took 0.60 seconds to destroy the instance on the hypervisor. 
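The DeleteDatastoreFile_Task records above follow the standard oslo.vmware task pattern: a *_Task method is invoked through the API session, and wait_for_task() then polls it, producing the "Waiting for the task ... to complete" and "progress is 0%" DEBUG lines seen here. A minimal sketch of that pattern, assuming placeholder vCenter credentials and a stand-in datacenter reference (none of this is lifted from the nova code itself):

    # Sketch: invoke a vSphere *_Task method and poll it, as in task-3467493.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test',   # placeholder vCenter host
        'administrator',     # placeholder username
        'secret',            # placeholder password
        api_retry_count=10,
        task_poll_interval=0.5)

    dc_ref = None  # stand-in: a real caller passes the Datacenter moref here
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 0cda1de0-73dd-45dd-932b-75e59fb785cf',
        datacenter=dc_ref)
    # wait_for_task() logs the "Waiting for the task"/"progress" lines and
    # raises a translated VimException if the task ends in an error state.
    task_info = session.wait_for_task(task)
    print(task_info.state)  # 'success', with duration_secs as logged above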
[ 1992.157686] env[67820]: DEBUG nova.compute.claims [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1992.157852] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1992.158087] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1992.172488] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1992.225981] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1992.285057] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1992.285057] env[67820]: DEBUG oslo_vmware.rw_handles [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1992.410118] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07d9026a-f744-41c5-9939-6962b7ef88a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.417695] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ad6e4c-f1a7-4910-86bf-a3861d5ee814 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.446745] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e83bcd6-a33c-4a53-9eab-1d15663d4394 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.453664] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4850f3d6-b7d5-448c-9523-c77a0df507bf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.466380] env[67820]: DEBUG nova.compute.provider_tree [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1992.475926] env[67820]: DEBUG nova.scheduler.client.report [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1992.491408] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.333s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1992.491944] env[67820]: ERROR nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1992.491944] env[67820]: Faults: ['InvalidArgument'] [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Traceback (most recent call last): [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1992.491944] 
env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self.driver.spawn(context, instance, image_meta, [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self._fetch_image_if_missing(context, vi) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] image_cache(vi, tmp_image_ds_loc) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] vm_util.copy_virtual_disk( [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] session._wait_for_task(vmdk_copy_task) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return self.wait_for_task(task_ref) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return evt.wait() [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] result = hub.switch() [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] return self.greenlet.switch() [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] self.f(*self.args, **self.kw) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] raise exceptions.translate_fault(task_info.error) [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Faults: ['InvalidArgument'] [ 1992.491944] env[67820]: ERROR nova.compute.manager [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] [ 1992.493031] env[67820]: DEBUG nova.compute.utils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1992.494868] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Build of instance 0cda1de0-73dd-45dd-932b-75e59fb785cf was re-scheduled: A specified parameter was not correct: fileType [ 1992.494868] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1992.494868] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1992.494988] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1992.495104] env[67820]: DEBUG nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1992.495268] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1992.933206] env[67820]: DEBUG nova.network.neutron [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1992.943714] env[67820]: INFO nova.compute.manager [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Took 0.45 seconds to deallocate network for instance. [ 1993.049050] env[67820]: INFO nova.scheduler.client.report [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Deleted allocations for instance 0cda1de0-73dd-45dd-932b-75e59fb785cf [ 1993.072848] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0e1fbd35-e1f9-4d95-b6b0-76c96cb2e7c3 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 634.413s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.074009] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 439.073s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1993.074303] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Acquiring lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1993.074535] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1993.074702] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.077724] env[67820]: INFO nova.compute.manager [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Terminating instance [ 1993.079389] env[67820]: DEBUG nova.compute.manager [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1993.079612] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1993.079870] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e5aea01c-7b11-4df9-ba2f-ff3e956f52d2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.089575] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ac9e401-b656-4e90-87c7-4723ecbbddd7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.099847] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1993.119605] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0cda1de0-73dd-45dd-932b-75e59fb785cf could not be found. 
[ 1993.119780] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1993.119956] env[67820]: INFO nova.compute.manager [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1993.120203] env[67820]: DEBUG oslo.service.loopingcall [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1993.120424] env[67820]: DEBUG nova.compute.manager [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1993.120519] env[67820]: DEBUG nova.network.neutron [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1993.143441] env[67820]: DEBUG nova.network.neutron [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1993.150715] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1993.150937] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1993.152750] env[67820]: INFO nova.compute.claims [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1993.155483] env[67820]: INFO nova.compute.manager [-] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] Took 0.03 seconds to deallocate network for instance. 
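The inventory payload repeated by the report client above fixes the schedulable capacity that the "Claim successful" line draws against: placement treats each resource class as (total - reserved) * allocation_ratio. A short sketch of that arithmetic using the exact values from this log:

    # Sketch: effective capacity behind the inventory dict logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 192.0       (48 physical vcpus, oversubscribed 4x)
    # MEMORY_MB 196078.0
    # DISK_GB 400.0

So the 10 allocated vcpus in the resource tracker's final resource view consume 10 of 192 schedulable VCPU units, which is why the claim for instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 succeeds immediately.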
[ 1993.241620] env[67820]: DEBUG oslo_concurrency.lockutils [None req-64520d20-a002-4040-96a1-b15aa66e0655 tempest-ServerActionsTestOtherA-1426582662 tempest-ServerActionsTestOtherA-1426582662-project-member] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.242859] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 242.631s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1993.243190] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 0cda1de0-73dd-45dd-932b-75e59fb785cf] During sync_power_state the instance has a pending task (deleting). Skip. [ 1993.243390] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "0cda1de0-73dd-45dd-932b-75e59fb785cf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.346631] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7660f2e9-7143-4983-84bb-b803454039e9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.354427] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b1bf33c-534f-4612-a2c2-ebda8b318c0c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.386792] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-078efa94-59fa-44b7-9900-5c52edae9e28 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.394697] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b02ded7-d651-42b7-8480-06334a2004ad {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.408421] env[67820]: DEBUG nova.compute.provider_tree [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1993.417525] env[67820]: DEBUG nova.scheduler.client.report [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 
94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1993.434491] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.434990] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1993.469654] env[67820]: DEBUG nova.compute.utils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1993.471247] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1993.471435] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1993.483441] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1993.550323] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1993.577624] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1993.577914] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1993.578093] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1993.578330] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1993.578462] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1993.578610] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1993.578817] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1993.578974] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1993.579154] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 
tempest-ServersTestJSON-449910846-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1993.579319] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1993.579492] env[67820]: DEBUG nova.virt.hardware [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1993.580439] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b92e332-349a-4f0a-a579-8078e0dd66aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.588599] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b11da3-e3f1-46e1-83a4-4a6938f198eb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.762878] env[67820]: DEBUG nova.policy [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8b9868addae45a49b19e7058f737988', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83044475bfd24b14a5a95b4b3fa0376c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 1994.083883] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Successfully created port: 291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1994.657108] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Successfully updated port: 291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1994.668712] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1994.668862] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1994.669041] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1994.704273] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1994.861136] env[67820]: DEBUG nova.network.neutron [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Updating instance_info_cache with network_info: [{"id": "291156bf-50a1-4c53-a777-951e337d9996", "address": "fa:16:3e:c0:0c:a2", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap291156bf-50", "ovs_interfaceid": "291156bf-50a1-4c53-a777-951e337d9996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1994.875856] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1994.876181] env[67820]: DEBUG nova.compute.manager [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Instance network_info: |[{"id": "291156bf-50a1-4c53-a777-951e337d9996", "address": "fa:16:3e:c0:0c:a2", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, 
"tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap291156bf-50", "ovs_interfaceid": "291156bf-50a1-4c53-a777-951e337d9996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1994.876581] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:0c:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '365ac5b1-6d83-4dfe-887f-60574d7f6124', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '291156bf-50a1-4c53-a777-951e337d9996', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1994.886256] env[67820]: DEBUG oslo.service.loopingcall [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1994.886256] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1994.886256] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-118a214f-7df0-43cc-9ca3-e5b1dfac33fd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1994.906266] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1994.906266] env[67820]: value = "task-3467494" [ 1994.906266] env[67820]: _type = "Task" [ 1994.906266] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1994.914083] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467494, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1995.094915] env[67820]: DEBUG nova.compute.manager [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Received event network-vif-plugged-291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1995.095170] env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Acquiring lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1995.097159] env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1995.097433] env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1995.097618] env[67820]: DEBUG nova.compute.manager [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] No waiting events found dispatching network-vif-plugged-291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1995.097786] env[67820]: WARNING nova.compute.manager [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Received unexpected event network-vif-plugged-291156bf-50a1-4c53-a777-951e337d9996 for instance with vm_state building and task_state spawning. [ 1995.097946] env[67820]: DEBUG nova.compute.manager [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Received event network-changed-291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1995.098124] env[67820]: DEBUG nova.compute.manager [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Refreshing instance network info cache due to event network-changed-291156bf-50a1-4c53-a777-951e337d9996. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1995.098364] env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Acquiring lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1995.098572] env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Acquired lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1995.098682] env[67820]: DEBUG nova.network.neutron [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Refreshing network info cache for port 291156bf-50a1-4c53-a777-951e337d9996 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1995.103907] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5df0c74a-75f0-4887-9aff-8d1298116116 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "eb759eb8-e670-4b9b-a0e0-865bdd53a208" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1995.411257] env[67820]: DEBUG nova.network.neutron [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Updated VIF entry in instance network info cache for port 291156bf-50a1-4c53-a777-951e337d9996. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1995.411616] env[67820]: DEBUG nova.network.neutron [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Updating instance_info_cache with network_info: [{"id": "291156bf-50a1-4c53-a777-951e337d9996", "address": "fa:16:3e:c0:0c:a2", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap291156bf-50", "ovs_interfaceid": "291156bf-50a1-4c53-a777-951e337d9996", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1995.418121] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467494, 'name': CreateVM_Task, 'duration_secs': 0.372279} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1995.418401] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1995.419032] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1995.419197] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1995.419502] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1995.419743] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-26d4972e-178a-4861-9bd3-6f731c7f3357 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1995.422345] 
env[67820]: DEBUG oslo_concurrency.lockutils [req-5024775b-0090-46b5-b977-e8c55d3e219a req-4b78d0e5-7827-48b7-8465-0d3075ee6028 service nova] Releasing lock "refresh_cache-4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1995.425359] env[67820]: DEBUG oslo_vmware.api [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 1995.425359] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]523d72b6-60f9-942b-e990-7bfe86a32ce6" [ 1995.425359] env[67820]: _type = "Task" [ 1995.425359] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1995.433191] env[67820]: DEBUG oslo_vmware.api [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]523d72b6-60f9-942b-e990-7bfe86a32ce6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1995.936496] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1995.936842] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1995.936953] env[67820]: DEBUG oslo_concurrency.lockutils [None req-079fa2ca-62df-43a8-a46b-a711b605d9d2 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2027.625819] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2027.626081] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2027.641667] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 0 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2028.826289] env[67820]: DEBUG oslo_concurrency.lockutils [None req-246c6507-9f69-4256-bf07-fe1429206af4 tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "4fb7ac00-ff06-4cf0-8a5e-41c76a390b38" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2036.621994] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.622308] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.622438] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2037.632171] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2037.632503] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2038.357704] env[67820]: WARNING oslo_vmware.rw_handles [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2038.357704] env[67820]: ERROR oslo_vmware.rw_handles [ 2038.358009] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 
2038.360400] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2038.360640] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Copying Virtual Disk [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/f3153b96-7f1f-41c2-a32e-d336944d2f21/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2038.360922] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f42bc2a3-a0f8-4c30-95d7-e1a989096fa2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.369271] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 2038.369271] env[67820]: value = "task-3467495" [ 2038.369271] env[67820]: _type = "Task" [ 2038.369271] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2038.378403] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': task-3467495, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2038.880556] env[67820]: DEBUG oslo_vmware.exceptions [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2038.880834] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2038.881432] env[67820]: ERROR nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.881432] env[67820]: Faults: ['InvalidArgument'] [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Traceback (most recent call last): [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] yield resources [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self.driver.spawn(context, instance, image_meta, [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self._fetch_image_if_missing(context, vi) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] image_cache(vi, tmp_image_ds_loc) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] vm_util.copy_virtual_disk( [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] session._wait_for_task(vmdk_copy_task) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return self.wait_for_task(task_ref) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return evt.wait() [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] result = hub.switch() [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return self.greenlet.switch() [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self.f(*self.args, **self.kw) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] raise exceptions.translate_fault(task_info.error) [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Faults: ['InvalidArgument'] [ 2038.881432] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] [ 2038.882178] env[67820]: INFO nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Terminating instance [ 2038.884074] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2038.884074] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2038.885031] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 
tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2038.885031] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2038.885143] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8af0c66-2117-4edd-a5e4-31452872f2e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.887676] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9adc6c0-b8e9-4dfe-9dc1-446c25fe331c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.895347] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2038.896422] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b2be5bf9-4d44-4ab3-9345-c0413ad410f1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.898851] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2038.902017] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2038.902017] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad4f5173-3236-4adb-9956-5fa796637004 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.904859] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 2038.904859] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f6b97e-aa82-8b64-b962-2c21764be52b" [ 2038.904859] env[67820]: _type = "Task" [ 2038.904859] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2038.913728] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52f6b97e-aa82-8b64-b962-2c21764be52b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2038.960439] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2038.960650] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2038.960833] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Deleting the datastore file [datastore1] 99f872a5-2e7d-42b9-a94f-67153db8d0ad {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2038.961122] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7856c5e0-e166-40af-87a1-73a4e2fd9fb7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2038.967073] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 2038.967073] env[67820]: value = "task-3467497" [ 2038.967073] env[67820]: _type = "Task" [ 2038.967073] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2038.974406] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': task-3467497, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2039.415393] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2039.415706] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating directory with path [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2039.415951] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5e4a868e-3533-4789-ab6b-8a4710518186 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.427188] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Created directory with path [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2039.427384] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Fetch image to [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2039.427569] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2039.428317] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b32cef9-1422-4178-b861-6f21ea5af5d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.434588] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b065859-cf96-4adf-bca3-88ed54d00d3c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.444203] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4014d67-b47e-4d39-9cae-5fd9ee797048 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.475865] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e5e0090-dbfe-4e40-aee8-81149ad2675f 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.483471] env[67820]: DEBUG oslo_vmware.api [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': task-3467497, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073753} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2039.484843] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2039.485042] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2039.485218] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2039.485388] env[67820]: INFO nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2039.487116] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6226fb80-9714-41cb-bcbb-6fffe359d0e8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.488958] env[67820]: DEBUG nova.compute.claims [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2039.489148] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2039.489358] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2039.510026] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2039.566863] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2039.625643] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2039.630023] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Completed reading data from the image iterator. 
{{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2039.630023] env[67820]: DEBUG oslo_vmware.rw_handles [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2039.639155] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2039.639397] env[67820]: DEBUG nova.compute.provider_tree [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2039.650729] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2039.671196] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2039.821466] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-557804b5-f204-472c-9a4d-602c6eafb0d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.829163] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad89e4e4-b31d-42fb-995c-e741d4b8a616 {{(pid=67820) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.858785] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-988295a9-d9e4-47fb-8fad-b1b64d3f32ef {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.865405] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5419e96-805a-45f6-810c-c7dfa205a604 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2039.879197] env[67820]: DEBUG nova.compute.provider_tree [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2039.887788] env[67820]: DEBUG nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2039.902397] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.413s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2039.902915] env[67820]: ERROR nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2039.902915] env[67820]: Faults: ['InvalidArgument'] [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Traceback (most recent call last): [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self.driver.spawn(context, instance, image_meta, [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 
99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self._fetch_image_if_missing(context, vi)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] image_cache(vi, tmp_image_ds_loc)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] vm_util.copy_virtual_disk(
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] session._wait_for_task(vmdk_copy_task)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return self.wait_for_task(task_ref)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return evt.wait()
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] result = hub.switch()
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] return self.greenlet.switch()
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] self.f(*self.args, **self.kw)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] raise exceptions.translate_fault(task_info.error)
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Faults: ['InvalidArgument']
[ 2039.902915] env[67820]: ERROR nova.compute.manager [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad]
[ 2039.903621] env[67820]: DEBUG nova.compute.utils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2039.905245] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Build of instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad was re-scheduled: A specified parameter was not correct: fileType
[ 2039.905245] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2039.905643] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2039.905814] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2039.905997] env[67820]: DEBUG nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2039.906174] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2040.378205] env[67820]: DEBUG nova.network.neutron [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2040.391491] env[67820]: INFO nova.compute.manager [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Took 0.49 seconds to deallocate network for instance.
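The traceback above shows the shape of the VMware driver's task handling: copy_virtual_disk submits a CopyVirtualDisk_Task, and wait_for_task parks the calling greenthread on an event while a looping call polls the task; when the task reports an error state, translate_fault turns the server-side fault into a VimFaultException, which the compute manager treats as a build failure and re-schedules. A minimal sketch of that polling pattern follows; it is illustrative only, not the oslo.vmware implementation, and get_task_info is a hypothetical stand-in for the real PropertyCollector round trip:

```python
import time


def wait_for_task(get_task_info, interval=0.5):
    """Poll a vCenter-style task until it reaches a terminal state.

    Mirrors the loop visible in the traceback: one poll per looping-call
    tick, success returns the result, error raises a translated fault.
    """
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # Stand-in for exceptions.translate_fault(task_info.error);
            # the message format matches the log line above.
            raise RuntimeError(f"{info['message']} Faults: {info['faults']}")
        time.sleep(interval)
```

Calling this with a stub that always returns `{'state': 'error', 'faults': ['InvalidArgument'], 'message': 'A specified parameter was not correct: fileType'}` would reproduce the failure mode logged here.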
[ 2040.490548] env[67820]: INFO nova.scheduler.client.report [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Deleted allocations for instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad
[ 2040.512536] env[67820]: DEBUG oslo_concurrency.lockutils [None req-5c102b71-2a66-4dc3-ab3d-b310a120bbff tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 637.082s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2040.514027] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 440.812s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2040.514027] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2040.514186] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2040.514307] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2040.516190] env[67820]: INFO nova.compute.manager [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Terminating instance
[ 2040.518111] env[67820]: DEBUG nova.compute.manager [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2040.518189] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2040.518627] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f0823e40-0331-4004-b8f8-be2e1833166a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2040.527457] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-478856e0-9abb-4145-98fb-5d67b23ffadf {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2040.538527] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2040.563314] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 99f872a5-2e7d-42b9-a94f-67153db8d0ad could not be found.
[ 2040.563522] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2040.563702] env[67820]: INFO nova.compute.manager [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 2040.563948] env[67820]: DEBUG oslo.service.loopingcall [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2040.564229] env[67820]: DEBUG nova.compute.manager [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2040.564326] env[67820]: DEBUG nova.network.neutron [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2040.586743] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2040.587052] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2040.588481] env[67820]: INFO nova.compute.claims [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2040.591215] env[67820]: DEBUG nova.network.neutron [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2040.599496] env[67820]: INFO nova.compute.manager [-] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] Took 0.03 seconds to deallocate network for instance.
[ 2040.698328] env[67820]: DEBUG oslo_concurrency.lockutils [None req-453daa3b-ea2a-4c39-ad34-cbe15459e858 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.185s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2040.699168] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 290.087s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2040.699352] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 99f872a5-2e7d-42b9-a94f-67153db8d0ad] During sync_power_state the instance has a pending task (deleting). Skip.
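The waited/held figures in the lockutils lines above carry the story: do_terminate_instance waited 440.812s on the per-instance lock because the (ultimately re-scheduled) build held it for 637.082s. A sketch of how such accounting can be produced around a plain threading.Lock follows; it only mimics the log messages and is not the oslo.concurrency implementation (timed_lock and the registry are hypothetical names):

```python
import threading
import time
from contextlib import contextmanager

_locks = {}                      # one named lock per resource, e.g. instance UUID
_registry_guard = threading.Lock()


@contextmanager
def timed_lock(name, by):
    """Acquire the named lock, logging wait and hold durations."""
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{by}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')
```

Nesting two of these, an instance lock and an "<uuid>-events" lock inside it, reproduces the acquire/release ordering the terminate path logs here.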
[ 2040.699533] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "99f872a5-2e7d-42b9-a94f-67153db8d0ad" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2040.789797] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ebe7fe3-1d00-4ea7-b4d2-782c48f7fc60 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2040.797881] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0671cd80-4cf6-4355-9685-7795eb1927a5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2040.828815] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a12807d-912e-4ecc-bfdf-d38a5b052b31 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2040.835849] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28967717-f5ed-4b34-ab85-9f56773dfcff {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2040.849114] env[67820]: DEBUG nova.compute.provider_tree [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2040.858012] env[67820]: DEBUG nova.scheduler.client.report [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2040.871542] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2040.871999] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2040.903431] env[67820]: DEBUG nova.compute.utils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2040.905711] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2040.905711] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2040.913166] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2040.977843] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2040.998529] env[67820]: DEBUG nova.policy [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2041.003111] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2041.003337] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2041.003493] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2041.003672] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2041.003820] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2041.003964] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2041.004188] env[67820]: DEBUG nova.virt.hardware [None 
req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2041.004349] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2041.004512] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2041.004676] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2041.004846] env[67820]: DEBUG nova.virt.hardware [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2041.005703] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38b7f19-4c04-4363-8155-b857330b46f1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.013651] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52e5d29b-87ef-4d0b-9b1a-edaa6b0126e3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2041.309716] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Successfully created port: 22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2041.622187] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2041.622370] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2041.622492] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2041.645671] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 
276123c5-3edc-4e33-9b13-baae0fc9de9f] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.645845] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.645974] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646126] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646249] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646366] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646637] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646637] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646750] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646836] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2041.646965] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2041.916277] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Successfully updated port: 22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2041.928917] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2041.928917] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2041.928917] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2041.990487] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2042.209272] env[67820]: DEBUG nova.network.neutron [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Updating instance_info_cache with network_info: [{"id": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "address": "fa:16:3e:64:d3:8b", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22966a38-22", "ovs_interfaceid": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2042.222355] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2042.222646] env[67820]: DEBUG nova.compute.manager [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Instance network_info: |[{"id": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "address": "fa:16:3e:64:d3:8b", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22966a38-22", "ovs_interfaceid": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2042.223037] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:64:d3:8b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '22966a38-223d-4a26-93ea-ec38bb53e7ae', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2042.230897] env[67820]: DEBUG oslo.service.loopingcall [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2042.231575] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2042.231891] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a516448c-36b1-4499-afd2-c6871c092573 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2042.252487] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2042.252487] env[67820]: value = "task-3467498" [ 2042.252487] env[67820]: _type = "Task" [ 2042.252487] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2042.259877] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467498, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2042.417730] env[67820]: DEBUG nova.compute.manager [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Received event network-vif-plugged-22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2042.417934] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Acquiring lock "1068e5cc-2514-4e07-aeee-e7e64c95a979-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2042.418159] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Lock "1068e5cc-2514-4e07-aeee-e7e64c95a979-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2042.418354] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Lock "1068e5cc-2514-4e07-aeee-e7e64c95a979-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2042.418496] env[67820]: DEBUG nova.compute.manager [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] No waiting events found dispatching network-vif-plugged-22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2042.418690] env[67820]: WARNING nova.compute.manager [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Received unexpected event network-vif-plugged-22966a38-223d-4a26-93ea-ec38bb53e7ae for instance with vm_state building and task_state spawning. [ 2042.418854] env[67820]: DEBUG nova.compute.manager [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Received event network-changed-22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2042.419017] env[67820]: DEBUG nova.compute.manager [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Refreshing instance network info cache due to event network-changed-22966a38-223d-4a26-93ea-ec38bb53e7ae. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2042.419204] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Acquiring lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2042.419338] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Acquired lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2042.419488] env[67820]: DEBUG nova.network.neutron [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Refreshing network info cache for port 22966a38-223d-4a26-93ea-ec38bb53e7ae {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2042.621511] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2042.673025] env[67820]: DEBUG nova.network.neutron [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Updated VIF entry in instance network info cache for port 22966a38-223d-4a26-93ea-ec38bb53e7ae. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2042.673422] env[67820]: DEBUG nova.network.neutron [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Updating instance_info_cache with network_info: [{"id": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "address": "fa:16:3e:64:d3:8b", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22966a38-22", "ovs_interfaceid": "22966a38-223d-4a26-93ea-ec38bb53e7ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2042.685163] env[67820]: DEBUG oslo_concurrency.lockutils [req-a125bd8a-f5c7-4db1-907a-2ce4da7016de req-95ffbf2a-100f-4d1d-a7b2-61ccaf5f20ac service nova] Releasing lock "refresh_cache-1068e5cc-2514-4e07-aeee-e7e64c95a979" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2042.763169] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467498, 'name': CreateVM_Task, 'duration_secs': 0.281744} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2042.763290] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2042.763923] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2042.764101] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2042.764440] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2042.764654] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bfa8dca1-d6cb-471d-9705-7c91c92c8df7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2042.769103] env[67820]: DEBUG oslo_vmware.api [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 2042.769103] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52bb3f8b-c37d-5423-abd8-f975433f8313" [ 2042.769103] env[67820]: _type = "Task" [ 2042.769103] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2042.776934] env[67820]: DEBUG oslo_vmware.api [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52bb3f8b-c37d-5423-abd8-f975433f8313, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2043.279872] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2043.279872] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2043.279872] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f1dd2846-b717-4eeb-bf27-49724fd13cf4 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2043.621505] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.616137] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.620831] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.632771] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.632996] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.633181] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2044.633337] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2044.634459] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9873fcfb-660b-4d41-a00c-23bd3172d5ea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.644251] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b5eee5d-2b8d-4fb1-996d-842cef73c826 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.658046] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-662ef49e-053a-454c-823e-cdcadfa4dfe2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.664338] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01285d79-bd78-47b3-b9ec-0afda1ea633b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.694146] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180939MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2044.694328] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.694503] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.785896] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.786197] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.786416] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.786622] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.786830] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.787045] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.787259] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.787467] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.787696] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.787902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.799573] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 75fc136a-9045-4b38-bb6c-37953cf8f778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2044.830400] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2044.830640] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2044.830823] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2044.968286] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e716e75b-02ce-4bf9-954c-28cbdc6dae0c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.975980] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b5284d-6e2b-45ef-b185-5325958610c5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.005020] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-775296bf-d798-41a8-b61f-8794aae0bf88 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.012308] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-906069f5-1485-4ee9-b4ac-dc0cb2443912 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2045.025080] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2045.033504] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2045.063481] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2045.063688] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.369s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2045.621571] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2046.629091] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2047.622383] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2054.155920] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "faa3fbe8-d076-422d-98ba-bfde42fb0580" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2054.156222] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "faa3fbe8-d076-422d-98ba-bfde42fb0580" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2088.376033] env[67820]: WARNING oslo_vmware.rw_handles [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2088.376033] env[67820]: ERROR 
oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2088.376033] env[67820]: ERROR oslo_vmware.rw_handles [ 2088.376575] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2088.378489] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2088.378760] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Copying Virtual Disk [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/1a658529-ab5b-4b72-a3f3-8aab5ad4c77d/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2088.379063] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-40aaf659-6438-4253-af8c-adc57c6b2bed {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.386933] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 2088.386933] env[67820]: value = "task-3467499" [ 2088.386933] env[67820]: _type = "Task" [ 2088.386933] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2088.394644] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467499, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2088.898087] env[67820]: DEBUG oslo_vmware.exceptions [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2088.898087] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2088.898325] env[67820]: ERROR nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.898325] env[67820]: Faults: ['InvalidArgument'] [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Traceback (most recent call last): [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] yield resources [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self.driver.spawn(context, instance, image_meta, [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self._fetch_image_if_missing(context, vi) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] image_cache(vi, tmp_image_ds_loc) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] vm_util.copy_virtual_disk( [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] session._wait_for_task(vmdk_copy_task) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return self.wait_for_task(task_ref) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return evt.wait() [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] result = hub.switch() [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return self.greenlet.switch() [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self.f(*self.args, **self.kw) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] raise exceptions.translate_fault(task_info.error) [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Faults: ['InvalidArgument'] [ 2088.898325] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] [ 2088.899139] env[67820]: INFO nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Terminating instance [ 2088.900218] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2088.900423] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2088.900661] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a16477e5-e549-47c7-b4c2-6ba5a339dd9e 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.902905] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2088.903104] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2088.903936] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e45de94-db52-4e30-941c-b01d9966301c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.910733] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2088.910917] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0eae7764-29e1-4f3b-88db-58c614712aac {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.913112] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2088.913286] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2088.914254] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4827b663-3a18-4260-a343-5fcca22d0f44 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.918804] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for the task: (returnval){ [ 2088.918804] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d01cdb-14a3-5a2b-4024-9e2d8ae7af14" [ 2088.918804] env[67820]: _type = "Task" [ 2088.918804] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2088.925925] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d01cdb-14a3-5a2b-4024-9e2d8ae7af14, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2088.979799] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2088.980047] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2088.980236] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleting the datastore file [datastore1] 276123c5-3edc-4e33-9b13-baae0fc9de9f {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2088.980529] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b64c553e-fc5e-413b-bec8-4ff46d360230 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2088.986741] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 2088.986741] env[67820]: value = "task-3467501" [ 2088.986741] env[67820]: _type = "Task" [ 2088.986741] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2088.994813] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467501, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2089.306316] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9aad4ce7-1deb-4538-893d-bdec07215c42 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "1068e5cc-2514-4e07-aeee-e7e64c95a979" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.430754] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2089.431072] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Creating directory with path [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2089.431270] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0aff2bf-be95-481a-9a28-328d7538bf06 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.442797] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Created directory with path [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2089.443016] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Fetch image to [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2089.443216] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2089.444018] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f618c70b-50f7-48df-b9c9-b04e73493757 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.450849] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c67532d-057a-4b29-9a81-ea36748bcf34 {{(pid=67820) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.459747] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95ccf209-db2b-4d1f-a998-436bf245334a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.491624] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd2da97f-cfe5-4e53-a2b5-d22b289dbdac {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.497997] env[67820]: DEBUG oslo_vmware.api [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': task-3467501, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078079} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2089.499340] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2089.499545] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2089.499705] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2089.499873] env[67820]: INFO nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2089.501802] env[67820]: DEBUG nova.compute.claims [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2089.502020] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.502210] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.504557] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5239ad6d-8e3b-4874-bf62-4247b937f34d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.526916] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2089.655496] env[67820]: DEBUG oslo_vmware.rw_handles [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2089.717661] env[67820]: DEBUG oslo_vmware.rw_handles [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2089.717847] env[67820]: DEBUG oslo_vmware.rw_handles [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2089.759537] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-379c01ed-7cdb-4fc8-9e94-19f903d159fd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.767307] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c77fc49d-9db3-47a5-9456-f2ff67f3d560 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.797859] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b48222-ee23-4343-b802-be0d87190d6e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.804900] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a0609f-148c-48f3-9ea7-01f1c0b568ed {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.817769] env[67820]: DEBUG nova.compute.provider_tree [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2089.825909] env[67820]: DEBUG nova.scheduler.client.report [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2089.840637] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.338s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.841156] env[67820]: ERROR nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2089.841156] env[67820]: Faults: ['InvalidArgument'] [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Traceback (most recent call last): [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2089.841156] env[67820]: ERROR 
nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self.driver.spawn(context, instance, image_meta, [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self._fetch_image_if_missing(context, vi) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] image_cache(vi, tmp_image_ds_loc) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] vm_util.copy_virtual_disk( [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] session._wait_for_task(vmdk_copy_task) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return self.wait_for_task(task_ref) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return evt.wait() [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] result = hub.switch() [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] return self.greenlet.switch() [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] self.f(*self.args, **self.kw) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] raise exceptions.translate_fault(task_info.error) [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Faults: ['InvalidArgument'] [ 2089.841156] env[67820]: ERROR nova.compute.manager [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] [ 2089.841966] env[67820]: DEBUG nova.compute.utils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2089.843115] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Build of instance 276123c5-3edc-4e33-9b13-baae0fc9de9f was re-scheduled: A specified parameter was not correct: fileType [ 2089.843115] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2089.843479] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2089.843646] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2089.843813] env[67820]: DEBUG nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2089.843969] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2090.230110] env[67820]: DEBUG nova.network.neutron [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2090.247773] env[67820]: INFO nova.compute.manager [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Took 0.40 seconds to deallocate network for instance. [ 2090.337753] env[67820]: INFO nova.scheduler.client.report [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Deleted allocations for instance 276123c5-3edc-4e33-9b13-baae0fc9de9f [ 2090.358264] env[67820]: DEBUG oslo_concurrency.lockutils [None req-7d6a5a96-4b68-457f-bd77-50ebc72afe95 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 590.251s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.359408] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 394.577s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2090.359656] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2090.359870] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2090.360047] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.361979] env[67820]: INFO nova.compute.manager [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Terminating instance [ 2090.363634] env[67820]: DEBUG nova.compute.manager [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2090.363823] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2090.364298] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e9c01ab4-e6b0-4f62-8ab5-70f20ac62b5d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.373480] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b80e5949-15e5-43ff-8e1a-0c76f892c629 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.384872] env[67820]: DEBUG nova.compute.manager [None req-f6e8c092-e11d-4603-8251-380e9e97564b tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 75fc136a-9045-4b38-bb6c-37953cf8f778] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2090.406899] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 276123c5-3edc-4e33-9b13-baae0fc9de9f could not be found. [ 2090.407113] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2090.407293] env[67820]: INFO nova.compute.manager [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2090.407562] env[67820]: DEBUG oslo.service.loopingcall [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2090.407754] env[67820]: DEBUG nova.compute.manager [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2090.407850] env[67820]: DEBUG nova.network.neutron [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2090.410236] env[67820]: DEBUG nova.compute.manager [None req-f6e8c092-e11d-4603-8251-380e9e97564b tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: 75fc136a-9045-4b38-bb6c-37953cf8f778] Instance disappeared before build. {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2090.441057] env[67820]: DEBUG oslo_concurrency.lockutils [None req-f6e8c092-e11d-4603-8251-380e9e97564b tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "75fc136a-9045-4b38-bb6c-37953cf8f778" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 195.991s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.444462] env[67820]: DEBUG nova.network.neutron [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2090.453161] env[67820]: INFO nova.compute.manager [-] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] Took 0.05 seconds to deallocate network for instance. [ 2090.455096] env[67820]: DEBUG nova.compute.manager [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Starting instance... 
{{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2090.517622] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2090.517890] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2090.520110] env[67820]: INFO nova.compute.claims [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2090.562018] env[67820]: DEBUG oslo_concurrency.lockutils [None req-eb38fe34-c039-40a6-ba4a-716142c86a88 tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.203s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.562971] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 339.951s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2090.563173] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 276123c5-3edc-4e33-9b13-baae0fc9de9f] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2090.563348] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "276123c5-3edc-4e33-9b13-baae0fc9de9f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.714716] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6997fad2-5b75-4aaa-b3ab-b29a5882ab39 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.722892] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6456e9d7-01f8-4eeb-8a95-3f40856c9d49 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.752629] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b77ac60-feab-48c0-88f7-3c420039d6c0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.760043] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2e1bfc0-cb8d-427f-9aec-5f2cf7e7f976 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.774649] env[67820]: DEBUG nova.compute.provider_tree [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2090.784070] env[67820]: DEBUG nova.scheduler.client.report [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2090.797465] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.797954] env[67820]: DEBUG nova.compute.manager [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2090.830360] env[67820]: DEBUG nova.compute.utils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2090.831504] env[67820]: DEBUG nova.compute.manager [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Not allocating networking since 'none' was specified. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 2090.840235] env[67820]: DEBUG nova.compute.manager [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2090.911366] env[67820]: DEBUG nova.compute.manager [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2090.936985] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2090.937244] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2090.937411] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2090.937674] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2090.937841] env[67820]: DEBUG nova.virt.hardware [None 
req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2090.937993] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2090.938233] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2090.938392] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2090.938567] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2090.938743] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2090.938913] env[67820]: DEBUG nova.virt.hardware [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2090.939758] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfe27741-901b-47dd-b826-912c98b14ffb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.947724] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-891233c2-3621-48bc-8a92-dc85bbc2031b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.960717] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Instance VIF info [] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2090.966247] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Creating folder: Project (30f88b3d49704d7cababd156c76d27c3). Parent ref: group-v692668. 
{{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2090.966504] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa2299ab-b82f-4bbd-95b3-672fa4a467f5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.976772] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "a3a11e77-9a46-442d-84d0-09f08acbfc64" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2090.977126] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "a3a11e77-9a46-442d-84d0-09f08acbfc64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2090.977223] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Created folder: Project (30f88b3d49704d7cababd156c76d27c3) in parent group-v692668. [ 2090.977355] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Creating folder: Instances. Parent ref: group-v692779. {{(pid=67820) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2090.977557] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c32323de-57cd-41ba-9dd1-4bb63d012470 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2090.986092] env[67820]: INFO nova.virt.vmwareapi.vm_util [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Created folder: Instances in parent group-v692779. [ 2090.986309] env[67820]: DEBUG oslo.service.loopingcall [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2090.986480] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2090.986699] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-48d56ba2-abd4-4b38-b4b8-956c511267cc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2091.001680] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2091.001680] env[67820]: value = "task-3467504" [ 2091.001680] env[67820]: _type = "Task" [ 2091.001680] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2091.008783] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467504, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2091.511703] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467504, 'name': CreateVM_Task, 'duration_secs': 0.252413} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2091.512047] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2091.512254] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2091.512416] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2091.512742] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2091.512976] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9beacdee-b7a8-4019-9a57-23c837db255a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2091.517289] env[67820]: DEBUG oslo_vmware.api [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Waiting for the task: (returnval){ [ 2091.517289] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52b0f2ce-8df8-9bff-5cef-a1e1c10518d9" [ 2091.517289] env[67820]: _type = "Task" [ 2091.517289] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2091.524917] env[67820]: DEBUG oslo_vmware.api [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52b0f2ce-8df8-9bff-5cef-a1e1c10518d9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2092.026903] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2092.027120] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2092.027330] env[67820]: DEBUG oslo_concurrency.lockutils [None req-b0dc8e8b-a953-425c-a0f1-9bab426f53e3 tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2094.617583] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2098.622592] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2099.622785] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2099.623200] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2102.622027] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2103.622463] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2103.622755] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2103.622796] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2103.643457] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.643617] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.643731] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.643857] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.643983] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644120] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644265] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644393] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644510] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644626] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2103.644753] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2104.622075] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2104.636557] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.636799] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.636980] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.637213] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2104.638391] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faa2d282-004a-44be-99a8-5be0e1e0bf13 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.646964] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22953de8-08f3-44e3-99bb-b861a46cd429 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.660627] env[67820]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-967b8da4-4c5d-421a-8230-fcbb16809a80 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.666618] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82027147-17ed-4cb0-94f4-1f16b646b80d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.694718] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180884MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2104.694861] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.695063] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.761407] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.761580] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.761705] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.761823] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.761963] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.762138] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.762262] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.762378] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.762490] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.762603] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2104.773093] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance faa3fbe8-d076-422d-98ba-bfde42fb0580 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2104.783379] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a3a11e77-9a46-442d-84d0-09f08acbfc64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2104.783588] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2104.783744] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2104.926119] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61bb4c2-1036-43fb-9dd8-8d3bd9709b97 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.934987] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3acea40-f67c-4474-b48f-34e875cff2df {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.963884] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bae93d3-b7a6-452d-94ac-2ba535acd1c7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.970718] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93295edf-295c-4048-bbe8-1ac55beea5ab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.983274] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2104.992596] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2105.006902] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2105.007238] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2106.006242] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2106.616760] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2106.621393] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2108.622067] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2139.018840] env[67820]: WARNING oslo_vmware.rw_handles [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2139.018840] env[67820]: ERROR oslo_vmware.rw_handles [ 2139.019612] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2139.021541] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2139.021796] 
env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Copying Virtual Disk [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/b49aa6d6-b127-441c-8b7f-69415fb95a1f/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2139.022083] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5f1c044f-4fd1-4899-927c-2b4041bbfa15 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.029847] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for the task: (returnval){ [ 2139.029847] env[67820]: value = "task-3467505" [ 2139.029847] env[67820]: _type = "Task" [ 2139.029847] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2139.037683] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Task: {'id': task-3467505, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2139.542147] env[67820]: DEBUG oslo_vmware.exceptions [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2139.542455] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2139.543043] env[67820]: ERROR nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2139.543043] env[67820]: Faults: ['InvalidArgument'] [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Traceback (most recent call last): [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] yield resources [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self.driver.spawn(context, instance, image_meta, [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self._fetch_image_if_missing(context, vi) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] image_cache(vi, tmp_image_ds_loc) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] vm_util.copy_virtual_disk( [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] session._wait_for_task(vmdk_copy_task) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return self.wait_for_task(task_ref) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return evt.wait() [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] result = hub.switch() [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return self.greenlet.switch() [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self.f(*self.args, **self.kw) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] raise exceptions.translate_fault(task_info.error) [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Faults: ['InvalidArgument'] [ 2139.543043] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] [ 2139.543828] env[67820]: INFO nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Terminating instance [ 2139.544915] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2139.545167] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2139.545413] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3db79d89-76d0-4368-b59d-8f579caa7f2b 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.547460] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2139.547641] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2139.547815] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2139.554329] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2139.554517] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2139.555690] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c33a01e-0cda-4e20-b2ac-a6574c95a524 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.562933] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for the task: (returnval){ [ 2139.562933] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]520c6c95-5cc2-a062-332b-d1e8950a745c" [ 2139.562933] env[67820]: _type = "Task" [ 2139.562933] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2139.570182] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]520c6c95-5cc2-a062-332b-d1e8950a745c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2139.586454] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2139.644376] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2139.652802] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Releasing lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2139.653197] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2139.653387] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2139.654424] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-369a3040-dadb-40c4-b3e9-a26564ed96bb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.661861] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2139.662084] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8971d4d7-1e20-4fc7-97e3-94679fe8d6ea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.691449] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2139.691652] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2139.691832] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Deleting the datastore file [datastore1] 1834f1ac-f85c-4176-b3c3-e85d50561b4a {{(pid=67820) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2139.692085] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26af3445-a937-4e43-999b-7e1048172f96 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2139.698146] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for the task: (returnval){ [ 2139.698146] env[67820]: value = "task-3467507" [ 2139.698146] env[67820]: _type = "Task" [ 2139.698146] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2139.705252] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Task: {'id': task-3467507, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2140.073197] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2140.073488] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Creating directory with path [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2140.073686] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-be2a3a54-858c-4aff-95cb-8e9189189c1e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.084566] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Created directory with path [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2140.084746] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Fetch image to [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2140.084916] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2140.085616] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3751a2e3-c44a-4b26-8de7-163a023bcb67 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.091945] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec29fd6-814f-459e-8f6f-a0f9da1727ec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.100881] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37c3b990-7a23-47d4-b9a6-8e58e3596544 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.131499] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58564f88-b729-4451-9462-12abb98af3da {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.136533] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c48c5a52-c295-4f27-88df-3ac8dae33a52 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.155754] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2140.207829] env[67820]: DEBUG oslo_vmware.api [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Task: {'id': task-3467507, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044051} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2140.208093] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2140.208278] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2140.208447] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2140.208618] env[67820]: INFO nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Took 0.56 seconds to destroy the instance on the hypervisor. [ 2140.208866] env[67820]: DEBUG oslo.service.loopingcall [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2140.209230] env[67820]: DEBUG nova.compute.manager [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2140.211219] env[67820]: DEBUG nova.compute.claims [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2140.211387] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2140.211598] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.292662] env[67820]: DEBUG oslo_vmware.rw_handles [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2140.352570] env[67820]: DEBUG oslo_vmware.rw_handles [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2140.352756] env[67820]: DEBUG oslo_vmware.rw_handles [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2140.458567] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a5aad9-1ae6-4d3e-afbf-2a14101f0b11 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.466158] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-822aeee3-ed27-4008-9ef7-006cb73ec6a1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.495606] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d29e097b-75b2-4153-b927-a8ace8675613 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.502328] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32ea6aae-67d2-48e2-b7d4-fa0122e7306b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.514965] env[67820]: DEBUG nova.compute.provider_tree [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2140.523811] env[67820]: DEBUG nova.scheduler.client.report [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2140.538513] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.539158] env[67820]: ERROR nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2140.539158] env[67820]: Faults: ['InvalidArgument'] [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Traceback (most recent call last): [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2140.539158] env[67820]: ERROR 
nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self.driver.spawn(context, instance, image_meta, [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self._fetch_image_if_missing(context, vi) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] image_cache(vi, tmp_image_ds_loc) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] vm_util.copy_virtual_disk( [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] session._wait_for_task(vmdk_copy_task) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return self.wait_for_task(task_ref) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return evt.wait() [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] result = hub.switch() [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] return self.greenlet.switch() [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] self.f(*self.args, **self.kw) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] raise exceptions.translate_fault(task_info.error) [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Faults: ['InvalidArgument'] [ 2140.539158] env[67820]: ERROR nova.compute.manager [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] [ 2140.539974] env[67820]: DEBUG nova.compute.utils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2140.541146] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Build of instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a was re-scheduled: A specified parameter was not correct: fileType [ 2140.541146] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2140.541522] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2140.541763] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2140.541925] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2140.542101] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2140.570140] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance cache missing network info. 
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2140.654556] env[67820]: DEBUG nova.network.neutron [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2140.663848] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Releasing lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2140.664088] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2140.664272] env[67820]: DEBUG nova.compute.manager [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Skipping network deallocation for instance since networking was not requested. {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2140.750391] env[67820]: INFO nova.scheduler.client.report [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Deleted allocations for instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a [ 2140.768850] env[67820]: DEBUG oslo_concurrency.lockutils [None req-79c4f2ad-e98b-4f46-8a40-0814a4cce8a4 tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 570.464s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.769887] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 390.157s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.770082] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] During sync_power_state the instance has a pending task (spawning). Skip.
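Every lockutils line in this log carries the same instrumentation: how long the caller waited to acquire the lock and how long it held it. The 570.464s hold above spans the entire _locked_do_build_and_run_instance call, and the 390.157s wait is _sync_power_states queuing behind it on the same per-instance lock. A minimal re-creation of that measurement as a context manager; the log format is approximated:

    # Sketch: measure how long a caller waited to acquire a lock and how long
    # it held it, mirroring the waited/held pairs in the lockutils lines above.
    import threading, time, logging

    log = logging.getLogger(__name__)

    class TimedLock:
        def __init__(self, name):
            self.name = name
            self._lock = threading.Lock()

        def __enter__(self):
            start = time.monotonic()
            self._lock.acquire()
            self._acquired = time.monotonic()
            log.debug('Lock "%s" acquired :: waited %.3fs',
                      self.name, self._acquired - start)
            return self

        def __exit__(self, *exc):
            held = time.monotonic() - self._acquired
            self._lock.release()
            log.debug('Lock "%s" released :: held %.3fs', self.name, held)

A long hold on a shared lock such as "compute_resources" shows up directly as the waited figure of every later claim, which is why the resource-tracker passages in this log run strictly one after another.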
[ 2140.770253] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.770844] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 374.814s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.771081] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2140.771284] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.771445] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.773148] env[67820]: INFO nova.compute.manager [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Terminating instance [ 2140.774667] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquiring lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2140.774793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Acquired lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2140.774954] env[67820]: DEBUG nova.network.neutron [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Building network info cache 
for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2140.783361] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2140.802022] env[67820]: DEBUG nova.network.neutron [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2140.841763] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2140.842056] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2140.843932] env[67820]: INFO nova.compute.claims [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2140.860180] env[67820]: DEBUG nova.network.neutron [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2140.868072] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Releasing lock "refresh_cache-1834f1ac-f85c-4176-b3c3-e85d50561b4a" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2140.868437] env[67820]: DEBUG nova.compute.manager [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Start destroying the instance on the hypervisor. 
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2140.868623] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2140.869110] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-78be3edc-4cad-4f2c-80bd-50b9da87f1a5 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.877839] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a538004d-9fa5-4f17-a0d3-e2c47217cf22 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2140.907920] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1834f1ac-f85c-4176-b3c3-e85d50561b4a could not be found. [ 2140.908172] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2140.908297] env[67820]: INFO nova.compute.manager [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2140.908529] env[67820]: DEBUG oslo.service.loopingcall [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2140.908761] env[67820]: DEBUG nova.compute.manager [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2140.908855] env[67820]: DEBUG nova.network.neutron [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2140.928316] env[67820]: DEBUG nova.network.neutron [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2140.935818] env[67820]: DEBUG nova.network.neutron [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2140.945570] env[67820]: INFO nova.compute.manager [-] [instance: 1834f1ac-f85c-4176-b3c3-e85d50561b4a] Took 0.04 seconds to deallocate network for instance. 
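The terminate path above is deliberately tolerant of a half-built instance: FindAllByUuid returns nothing, vmops logs "Instance does not exist on backend" and proceeds as if the destroy succeeded, and network deallocation runs under an oslo.service looping call so transient failures can be retried. A compressed sketch of that shape; the retry budget and the injected callables are assumptions:

    # Sketch: destroy must succeed even when the backend VM is already gone,
    # and network teardown is retried a bounded number of times.
    import time, logging

    log = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        pass

    def destroy(find_vm, destroy_vm, neutron_deallocate, uuid,
                retries=3, delay=1.0):
        try:
            vm_ref = find_vm(uuid)          # e.g. a FindAllByUuid lookup
            destroy_vm(vm_ref)
        except InstanceNotFound:
            log.warning("Instance %s does not exist on backend; "
                        "treating as destroyed", uuid)
        for attempt in range(1, retries + 1):
            try:
                neutron_deallocate(uuid)    # deallocate_for_instance()
                return
            except Exception:
                if attempt == retries:
                    raise
                log.debug("deallocate failed, retrying (%d/%d)",
                          attempt, retries)
                time.sleep(delay)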
[ 2141.034541] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1ef94f3e-4c46-4230-a215-ee3f5d721e2f tempest-ServersAaction247Test-1317051121 tempest-ServersAaction247Test-1317051121-project-member] Lock "1834f1ac-f85c-4176-b3c3-e85d50561b4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.264s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2141.038283] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61d25f93-c1a4-483c-a4b8-22bd8cd0cc3b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.046152] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb119d06-ddbe-42c0-99e9-ff9281c8694e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.077114] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a39f8783-7db6-4dd0-b747-2a8b37253e79 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.084035] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f939ed-9d5d-4277-95a8-e426b2b05e37 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.097612] env[67820]: DEBUG nova.compute.provider_tree [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2141.106338] env[67820]: DEBUG nova.scheduler.client.report [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2141.120989] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2141.121475] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Start building networks asynchronously for instance. 
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2141.158606] env[67820]: DEBUG nova.compute.utils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2141.160081] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2141.160272] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2141.169943] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2141.214094] env[67820]: DEBUG nova.policy [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5af508bde8847228f40888783142106', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ccf8fe5def284576a660bd7505892bde', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2141.232864] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Start spawning the instance on the hypervisor. 
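Note the ordering in the build above: "Allocating IP information in the background" is kicked off first, and the manager immediately moves on to "Start building block device mappings", so Neutron port creation overlaps local preparation. Nova does this with eventlet green threads; the same overlap can be sketched with a standard-library future, with all names here illustrative:

    # Sketch: start network allocation concurrently, prepare block devices in
    # the meantime, then join on the network result before spawning.
    # concurrent.futures stands in for nova's eventlet green threads.
    from concurrent.futures import ThreadPoolExecutor

    def build_resources(allocate_network, build_block_devices, spawn, instance):
        with ThreadPoolExecutor(max_workers=1) as pool:
            nw_future = pool.submit(allocate_network, instance)  # background
            bdms = build_block_devices(instance)                 # runs meanwhile
            network_info = nw_future.result()   # join before the spawn needs it
        spawn(instance, network_info, bdms)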
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2141.257499] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T20:41:49Z,virtual_size=,visibility=), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2141.257907] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2141.258136] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2141.258331] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2141.258478] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2141.258623] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2141.258829] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2141.258988] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2141.259167] 
env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2141.259327] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2141.259494] env[67820]: DEBUG nova.virt.hardware [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2141.260437] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e51d5146-ddec-4d67-bfc3-c241c28891d9 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.267799] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd552e4-c1f3-4ed3-b6ed-25c9d2946c18 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.626265] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Successfully created port: c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2142.158133] env[67820]: DEBUG nova.compute.manager [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Received event network-vif-plugged-c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2142.158133] env[67820]: DEBUG oslo_concurrency.lockutils [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] Acquiring lock "faa3fbe8-d076-422d-98ba-bfde42fb0580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2142.158133] env[67820]: DEBUG oslo_concurrency.lockutils [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] Lock "faa3fbe8-d076-422d-98ba-bfde42fb0580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2142.158133] env[67820]: DEBUG oslo_concurrency.lockutils [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] Lock "faa3fbe8-d076-422d-98ba-bfde42fb0580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
2142.158133] env[67820]: DEBUG nova.compute.manager [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] No waiting events found dispatching network-vif-plugged-c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2142.158133] env[67820]: WARNING nova.compute.manager [req-ea76155c-ac68-4d7e-b4f3-e057950455f9 req-957aeb85-ccb1-4691-a08c-fa9c2fe65da5 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Received unexpected event network-vif-plugged-c02d10fc-56da-4833-ba1b-5b836161f80c for instance with vm_state building and task_state spawning. [ 2142.275960] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Successfully updated port: c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2142.286564] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2142.286564] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2142.286564] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2142.342094] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Instance cache missing network info. 
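The network-vif-plugged sequence above is the external-event handshake seen from the compute side: Neutron reports the port is live, nova pops the per-instance event under its "-events" lock, finds no registered waiter ("No waiting events found"), and logs the event as unexpected because the still-spawning instance never armed a wait. A small model of that dispatch table, reduced to threading primitives:

    # Sketch: callers register an Event before an action neutron will confirm;
    # the event callback pops it, or logs "unexpected" if nothing is waiting.
    import threading, logging

    log = logging.getLogger(__name__)

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}   # (instance_uuid, event_name) -> threading.Event

        def prepare(self, uuid, name):
            ev = threading.Event()
            with self._lock:
                self._waiters[(uuid, name)] = ev
            return ev            # caller later does ev.wait(timeout)

        def dispatch(self, uuid, name):
            with self._lock:
                ev = self._waiters.pop((uuid, name), None)
            if ev is None:
                log.warning("Received unexpected event %s for instance %s",
                            name, uuid)
            else:
                ev.set()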
{{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2142.512134] env[67820]: DEBUG nova.network.neutron [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Updating instance_info_cache with network_info: [{"id": "c02d10fc-56da-4833-ba1b-5b836161f80c", "address": "fa:16:3e:35:30:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc02d10fc-56", "ovs_interfaceid": "c02d10fc-56da-4833-ba1b-5b836161f80c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2142.524335] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Releasing lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2142.524605] env[67820]: DEBUG nova.compute.manager [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Instance network_info: |[{"id": "c02d10fc-56da-4833-ba1b-5b836161f80c", "address": "fa:16:3e:35:30:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc02d10fc-56", "ovs_interfaceid": "c02d10fc-56da-4833-ba1b-5b836161f80c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2142.524976] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:35:30:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4aa1eda7-48b9-4fa2-af0b-94c718313af2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c02d10fc-56da-4833-ba1b-5b836161f80c', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2142.532557] env[67820]: DEBUG oslo.service.loopingcall [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2142.533017] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2142.533259] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5a1771cd-859f-4082-91de-a2895c6b4ce2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.554752] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2142.554752] env[67820]: value = "task-3467508" [ 2142.554752] env[67820]: _type = "Task" [ 2142.554752] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2142.563971] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467508, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2143.065495] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467508, 'name': CreateVM_Task, 'duration_secs': 0.285486} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2143.065668] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2143.066364] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2143.066533] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2143.066857] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2143.067127] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-676b167f-1580-4893-b8c1-57ff254e9cd4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.071472] env[67820]: DEBUG oslo_vmware.api [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for the task: (returnval){ [ 2143.071472] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d42c7e-4d49-e418-bd6d-4479dbc75b85" [ 2143.071472] env[67820]: _type = "Task" [ 2143.071472] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2143.078902] env[67820]: DEBUG oslo_vmware.api [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d42c7e-4d49-e418-bd6d-4479dbc75b85, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2143.582327] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2143.582793] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2143.582793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-51a92e98-f07d-4637-9356-d4bb41f497e4 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2144.184865] env[67820]: DEBUG nova.compute.manager [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Received event network-changed-c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2144.185290] env[67820]: DEBUG nova.compute.manager [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Refreshing instance network info cache due to event network-changed-c02d10fc-56da-4833-ba1b-5b836161f80c. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2144.185290] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] Acquiring lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2144.185398] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] Acquired lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2144.185556] env[67820]: DEBUG nova.network.neutron [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Refreshing network info cache for port c02d10fc-56da-4833-ba1b-5b836161f80c {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2144.416276] env[67820]: DEBUG nova.network.neutron [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Updated VIF entry in instance network info cache for port c02d10fc-56da-4833-ba1b-5b836161f80c. 
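The network-changed event that follows triggers a refresh rather than a rebuild: under the refresh_cache-<uuid> lock the handler re-reads the port from Neutron and rewrites only the matching entry in instance_info_cache ("Updated VIF entry in instance network info cache for port c02d10fc-..."). The update-in-place step, with the data shapes reduced to plain dicts:

    # Sketch: replace one VIF entry in the cached network_info list, keyed by
    # port id, as the network-changed handler does above.
    def update_vif_entry(network_info, fresh_vif):
        for i, vif in enumerate(network_info):
            if vif["id"] == fresh_vif["id"]:
                network_info[i] = fresh_vif      # update in place
                return network_info
        network_info.append(fresh_vif)           # port not cached yet
        return network_info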
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2144.416655] env[67820]: DEBUG nova.network.neutron [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Updating instance_info_cache with network_info: [{"id": "c02d10fc-56da-4833-ba1b-5b836161f80c", "address": "fa:16:3e:35:30:53", "network": {"id": "ee19f382-35ef-4070-a391-5060f8bc0bd5", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1134714078-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ccf8fe5def284576a660bd7505892bde", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4aa1eda7-48b9-4fa2-af0b-94c718313af2", "external-id": "nsx-vlan-transportzone-502", "segmentation_id": 502, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc02d10fc-56", "ovs_interfaceid": "c02d10fc-56da-4833-ba1b-5b836161f80c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2144.426653] env[67820]: DEBUG oslo_concurrency.lockutils [req-7e892aeb-808f-4421-b2f9-d0e2e9a525f5 req-92660511-cfaf-416b-847f-ba058dc454a8 service nova] Releasing lock "refresh_cache-faa3fbe8-d076-422d-98ba-bfde42fb0580" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2158.622183] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2158.813551] env[67820]: DEBUG oslo_concurrency.lockutils [None req-acb68118-8ef9-40c8-a5b9-bababd7ff1ff tempest-ServerShowV254Test-388012704 tempest-ServerShowV254Test-388012704-project-member] Acquiring lock "eda6fcc3-b964-4728-a2e2-ece044b0ffa2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2159.621795] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2159.622024] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
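From here the log is dominated by the periodic-task engine: each ComputeManager task fires on its own interval, and several bail out immediately on configuration, as _reclaim_queued_deletes does when CONF.reclaim_instance_interval <= 0. A stripped-down runner showing both behaviours; the task set and intervals are examples, not nova's:

    # Sketch: a periodic-task loop that honours per-task intervals and lets
    # tasks skip themselves on config, like _reclaim_queued_deletes above.
    import time

    def run_periodic_tasks(tasks, now=time.monotonic):
        # tasks: list of (name, interval_seconds, callable)
        last_run = {name: 0.0 for name, _, _ in tasks}
        while True:                         # daemon loop, runs for process life
            for name, interval, fn in tasks:
                if now() - last_run[name] >= interval:
                    fn()                    # the task itself may decide to skip
                    last_run[name] = now()
            time.sleep(1.0)

    def reclaim_queued_deletes(reclaim_instance_interval=0):
        if reclaim_instance_interval <= 0:
            return                          # the "skipping..." branch in the log
        # ... reclaim soft-deleted instances here ...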
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2163.622578] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2163.622939] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2163.622939] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2163.648353] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.648559] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.648633] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.648753] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.648875] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.648995] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.649127] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.649246] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.649362] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.649478] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2163.649596] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2164.621642] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2165.621406] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2166.621599] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2166.642878] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2166.643129] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2166.643300] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2166.643457] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2166.644585] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc8acda-625d-470b-b6d7-d7e1d5754791 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.653335] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb96500-f7b8-403d-8529-da1d6342ceda {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.667230] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a6c2ab-d685-44be-9ce0-761c7f65c63f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.673434] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f69b73cb-ad9b-44c5-8d26-55bbb3ce8d31 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.701732] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2166.701872] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2166.702081] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2166.794086] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794267] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794397] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794519] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794637] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794753] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794866] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.794979] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.795104] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.795219] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance faa3fbe8-d076-422d-98ba-bfde42fb0580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2166.806404] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a3a11e77-9a46-442d-84d0-09f08acbfc64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 2166.806622] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2166.806767] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2166.929272] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d4debdd-b1dc-4820-a2e9-4b05d13cafec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.936883] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db62f96a-decf-43cb-8547-800eafe88951 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.967375] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c9ed62b-f674-4171-97ce-e54bde6a1d1a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.974127] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d64a885-bcbe-49e4-8c63-209b0a86a17b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2166.986704] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2166.994800] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2167.026310] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2167.026493] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2169.021680] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2169.022075] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2170.621619] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2186.096052] env[67820]: WARNING oslo_vmware.rw_handles [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles response.begin()
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2186.096052] env[67820]: ERROR oslo_vmware.rw_handles
[ 2186.096797] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2186.098777] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2186.099044] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Copying Virtual Disk [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/508a7faf-06a7-411c-9e46-aa44b4b4cc7d/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2186.099366] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8ce362ee-07da-432c-8544-99b5d67f4aae {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.108466] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for the task: (returnval){
[ 2186.108466] env[67820]: value = "task-3467509"
[ 2186.108466] env[67820]: _type = "Task"
[ 2186.108466] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2186.117230] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Task: {'id': task-3467509, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2186.618824] env[67820]: DEBUG oslo_vmware.exceptions [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2186.619138] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2186.619681] env[67820]: ERROR nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2186.619681] env[67820]: Faults: ['InvalidArgument']
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Traceback (most recent call last):
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] yield resources
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self.driver.spawn(context, instance, image_meta,
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self._fetch_image_if_missing(context, vi)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] image_cache(vi, tmp_image_ds_loc)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] vm_util.copy_virtual_disk(
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] session._wait_for_task(vmdk_copy_task)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return self.wait_for_task(task_ref)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return evt.wait()
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] result = hub.switch()
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return self.greenlet.switch()
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self.f(*self.args, **self.kw)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] raise exceptions.translate_fault(task_info.error)
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Faults: ['InvalidArgument']
[ 2186.619681] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4]
[ 2186.620629] env[67820]: INFO nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Terminating instance
[ 2186.621589] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2186.622481] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2186.622481] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f278f41-3a15-455e-8e57-09392bf79910 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.624304] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2186.624495] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2186.625260] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-373dba29-99f2-4515-b1b4-30eb90161eb4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.631893] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2186.632185] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f8e241c6-d20a-4641-9121-95f5256c688a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.634339] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2186.634508] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2186.635468] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b5c89b9-f393-4a76-83f9-ffc4a5152914 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.640412] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){
[ 2186.640412] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]521d53cd-3135-2e26-9974-b3e5e0b185f3"
[ 2186.640412] env[67820]: _type = "Task"
[ 2186.640412] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2186.650682] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]521d53cd-3135-2e26-9974-b3e5e0b185f3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2186.703671] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2186.703880] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2186.704078] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Deleting the datastore file [datastore1] 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2186.704384] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70b23587-c027-479a-99c6-2e246466550b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2186.711044] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for the task: (returnval){
[ 2186.711044] env[67820]: value = "task-3467511"
[ 2186.711044] env[67820]: _type = "Task"
[ 2186.711044] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2186.719365] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Task: {'id': task-3467511, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2187.150912] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2187.151208] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating directory with path [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2187.151441] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6172055a-237c-43b0-8417-f6ee181d3c1b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.163034] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created directory with path [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2187.163296] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Fetch image to [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2187.163480] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2187.164245] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49de465b-7e6e-4fb5-83af-bf687b249515 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.171365] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4543aad5-a01c-4e72-8636-16cd327c30ab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.180280] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea5f187-68d1-400d-bc88-837aa0409686 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.211123] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e4413c-969b-475b-9509-2bfcc0634f9a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.222409] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-33137311-8bcd-4ade-b3d1-98d982371013 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.224116] env[67820]: DEBUG oslo_vmware.api [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Task: {'id': task-3467511, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079233} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2187.224353] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2187.224531] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2187.224701] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2187.225122] env[67820]: INFO nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Took 0.60 seconds to destroy the instance on the hypervisor.
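Editor's note: the task waits above (CopyVirtualDisk_Task, SearchDatastore_Task, DeleteDatastoreFile_Task) all show the same shape: submit the vSphere task, then poll it at a fixed interval, logging progress each round, until it either completes (the 'duration_secs' entry) or ends in a fault that gets translated into an exception (the VimFaultException below). A minimal, self-contained Python sketch of that polling shape follows; `FakeTask` and its `info()` tuple are hypothetical stand-ins, not oslo.vmware's actual classes.

```python
import time


class FakeTask:
    """Hypothetical stand-in for a vSphere task handle; the real driver
    polls a TaskInfo managed object through the vSphere API instead."""

    def __init__(self, polls_until_done):
        self._left = polls_until_done

    def info(self):
        # Mimics TaskInfo as a (state, progress, error) tuple.
        if self._left > 0:
            self._left -= 1
            return ("running", 0, None)
        return ("success", 100, None)


def wait_for_task(task, interval=0.5):
    """Poll at a fixed interval until the task succeeds or reports an
    error, logging progress on every poll, as the entries above do."""
    while True:
        state, progress, error = task.info()
        print(f"Task progress is {progress}%. (state={state})")
        if state == "success":
            return
        if state == "error":
            # oslo.vmware translates the fault carried by the task into
            # an exception class at this point (the VimFaultException
            # case seen in this log).
            raise RuntimeError(f"task failed: {error}")
        time.sleep(interval)


wait_for_task(FakeTask(polls_until_done=2))
```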
[ 2187.227094] env[67820]: DEBUG nova.compute.claims [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2187.227297] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2187.227534] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2187.244349] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2187.310919] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2187.371774] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2187.372017] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2187.464845] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb44612-3f3f-4fe5-8c7b-b48807154cbe {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.472690] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9915649-5903-416e-9a1f-2cfbb659c831 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.502823] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faa166d8-6b8e-47c0-b289-9ecb5d44183d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.510553] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac16079-4730-4c2d-ad41-5900a0e397fd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2187.523988] env[67820]: DEBUG nova.compute.provider_tree [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2187.532699] env[67820]: DEBUG nova.scheduler.client.report [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2187.546841] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.319s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2187.547365] env[67820]: ERROR nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2187.547365] env[67820]: Faults: ['InvalidArgument']
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Traceback (most recent call last):
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self.driver.spawn(context, instance, image_meta,
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self._fetch_image_if_missing(context, vi)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] image_cache(vi, tmp_image_ds_loc)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] vm_util.copy_virtual_disk(
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] session._wait_for_task(vmdk_copy_task)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return self.wait_for_task(task_ref)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return evt.wait()
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] result = hub.switch()
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] return self.greenlet.switch()
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] self.f(*self.args, **self.kw)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] raise exceptions.translate_fault(task_info.error)
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Faults: ['InvalidArgument']
[ 2187.547365] env[67820]: ERROR nova.compute.manager [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4]
[ 2187.548583] env[67820]: DEBUG nova.compute.utils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2187.549388] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Build of instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 was re-scheduled: A specified parameter was not correct: fileType
[ 2187.549388] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2187.549761] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2187.549931] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2187.550109] env[67820]: DEBUG nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2187.550272] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2187.938266] env[67820]: DEBUG nova.network.neutron [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2187.950996] env[67820]: INFO nova.compute.manager [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Took 0.40 seconds to deallocate network for instance.
[ 2188.041275] env[67820]: INFO nova.scheduler.client.report [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Deleted allocations for instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4
[ 2188.085826] env[67820]: DEBUG oslo_concurrency.lockutils [None req-fab21fc9-e854-40a0-a1d0-75fc4734dd46 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 598.399s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.087743] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 437.475s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2188.088207] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] During sync_power_state the instance has a pending task (spawning). Skip.
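Editor's note: the failure path above has a consistent shape: spawn raises, the resource claim is aborted, networking is deallocated, the placement allocation is deleted, and the build is handed back for rescheduling. The sketch below illustrates only that shape; `StubDriver`, its methods, and the local `VimFaultException` class are hypothetical stand-ins, far simpler than Nova's actual `_do_build_and_run_instance`.

```python
class VimFaultException(Exception):
    """Local stand-in for oslo_vmware.exceptions.VimFaultException."""


class StubDriver:
    """Hypothetical driver whose spawn fails the way the log shows."""

    def spawn(self, instance):
        raise VimFaultException(
            "A specified parameter was not correct: fileType")

    def abort_claim(self, instance):
        print(f"[instance: {instance}] Aborting claim")

    def deallocate_network(self, instance):
        print(f"[instance: {instance}] Deallocating network for instance")

    def delete_allocations(self, instance):
        print(f"Deleted allocations for instance {instance}")


def build_and_run_instance(driver, instance):
    try:
        driver.spawn(instance)
    except VimFaultException as exc:
        # Roll back everything the failed build consumed, then let the
        # scheduler retry the build elsewhere (or on the same host).
        driver.abort_claim(instance)
        driver.deallocate_network(instance)
        driver.delete_allocations(instance)
        print(f"Build of instance {instance} was re-scheduled: {exc}")


build_and_run_instance(StubDriver(),
                       "1694799a-76d6-4e3e-83e1-5e2e4ad486d4")
```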
[ 2188.088207] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.088643] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 403.135s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2188.088866] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Acquiring lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2188.089077] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2188.089245] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.091383] env[67820]: INFO nova.compute.manager [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Terminating instance
[ 2188.093256] env[67820]: DEBUG nova.compute.manager [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2188.093445] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2188.093694] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-49506ad1-f6df-4e19-9d64-ba7949fbb667 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.096995] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2188.106540] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31853f26-cb2a-420b-bdd3-501df7da3516 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.137547] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1694799a-76d6-4e3e-83e1-5e2e4ad486d4 could not be found.
[ 2188.137730] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2188.137880] env[67820]: INFO nova.compute.manager [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2188.138130] env[67820]: DEBUG oslo.service.loopingcall [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2188.138957] env[67820]: DEBUG nova.compute.manager [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2188.139070] env[67820]: DEBUG nova.network.neutron [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2188.155811] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2188.156078] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2188.157544] env[67820]: INFO nova.compute.claims [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2188.167070] env[67820]: DEBUG nova.network.neutron [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2188.177812] env[67820]: INFO nova.compute.manager [-] [instance: 1694799a-76d6-4e3e-83e1-5e2e4ad486d4] Took 0.04 seconds to deallocate network for instance.
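Editor's note: every resource-tracker mutation above (instance_claim, abort_instance_claim, _update_available_resource, clean_compute_node_cache) serializes on the same named "compute_resources" lock, which is why the log reports waited/held durations for each holder. A minimal sketch of that pattern follows, assuming oslo.concurrency is installed; the function and its arguments are illustrative, not Nova's actual resource tracker.

```python
from oslo_concurrency import lockutils


@lockutils.synchronized("compute_resources")
def instance_claim(instance, vcpus, memory_mb):
    # While this runs, concurrent callers of anything decorated with the
    # same lock name (e.g. the update_available_resource periodic task)
    # block here, and lockutils logs how long each one waited and held.
    print(f"claiming {vcpus} VCPU / {memory_mb} MB for {instance}")


instance_claim("a3a11e77-9a46-442d-84d0-09f08acbfc64", 1, 128)
```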
[ 2188.262863] env[67820]: DEBUG oslo_concurrency.lockutils [None req-328d2334-b01f-4803-9de4-0ceecf199f64 tempest-SecurityGroupsTestJSON-278594785 tempest-SecurityGroupsTestJSON-278594785-project-member] Lock "1694799a-76d6-4e3e-83e1-5e2e4ad486d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.328602] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1455a5d-f008-4f31-8069-04c273fac7de {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.336442] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3aab9d7-833d-40b7-bbce-1e2a331d51ee {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.365879] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55054ad4-c340-40a4-8a57-7b3c81dc2444 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.372751] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfc2c8b3-5ffe-49f6-a741-544b46d62230 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2188.385375] env[67820]: DEBUG nova.compute.provider_tree [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2188.394266] env[67820]: DEBUG nova.scheduler.client.report [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2188.409919] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2188.410555] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Start building networks asynchronously for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2188.450307] env[67820]: DEBUG nova.compute.utils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2188.451612] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 2188.451785] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 2188.461028] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2188.522592] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Start spawning the instance on the hypervisor. {{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2188.549301] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2188.549544] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2188.549699] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2188.549877] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2188.550034] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2188.550184] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2188.550389] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2188.550559] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2188.550716] env[67820]: DEBUG
nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2188.550874] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2188.551058] env[67820]: DEBUG nova.virt.hardware [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2188.551915] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a974c32-8c42-4e82-9e30-6b641c44b683 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.555508] env[67820]: DEBUG nova.policy [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1fdcd371f66742e2b8a56846e91e62aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7767a564247b405b92073629bffda753', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2188.561869] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88c8e53d-74f6-425a-96c6-e444b66dd242 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2188.903170] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Successfully created port: c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2189.704260] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Successfully updated port: c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2189.716205] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2189.716360] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2189.716511] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2189.779541] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2189.932125] env[67820]: DEBUG nova.network.neutron [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Updating instance_info_cache with network_info: [{"id": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "address": "fa:16:3e:14:58:db", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9a7013e-08", "ovs_interfaceid": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2189.944654] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2189.944964] env[67820]: DEBUG nova.compute.manager [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Instance network_info: |[{"id": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "address": "fa:16:3e:14:58:db", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9a7013e-08", "ovs_interfaceid": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2189.945356] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:14:58:db', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '40c947c4-f471-4d48-8e43-fee54198107e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c9a7013e-0880-49ab-ac57-7d59d96a1da6', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2189.953170] env[67820]: DEBUG oslo.service.loopingcall [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2189.953592] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2189.953826] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-94c95133-509b-42d8-a28b-db36036c2495 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2189.974178] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2189.974178] env[67820]: value = "task-3467512" [ 2189.974178] env[67820]: _type = "Task" [ 2189.974178] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2189.979881] env[67820]: DEBUG nova.compute.manager [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Received event network-vif-plugged-c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2189.979881] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Acquiring lock "a3a11e77-9a46-442d-84d0-09f08acbfc64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2189.980219] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Lock "a3a11e77-9a46-442d-84d0-09f08acbfc64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2189.980219] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Lock "a3a11e77-9a46-442d-84d0-09f08acbfc64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2189.980350] env[67820]: DEBUG nova.compute.manager [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] No waiting events found dispatching network-vif-plugged-c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2189.980417] env[67820]: WARNING nova.compute.manager [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Received unexpected event network-vif-plugged-c9a7013e-0880-49ab-ac57-7d59d96a1da6 for instance with vm_state building and task_state spawning. [ 2189.980600] env[67820]: DEBUG nova.compute.manager [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Received event network-changed-c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2189.980757] env[67820]: DEBUG nova.compute.manager [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Refreshing instance network info cache due to event network-changed-c9a7013e-0880-49ab-ac57-7d59d96a1da6. 
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2189.980934] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Acquiring lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2189.981085] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Acquired lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2189.981239] env[67820]: DEBUG nova.network.neutron [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Refreshing network info cache for port c9a7013e-0880-49ab-ac57-7d59d96a1da6 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2189.985498] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467512, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2190.250357] env[67820]: DEBUG nova.network.neutron [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Updated VIF entry in instance network info cache for port c9a7013e-0880-49ab-ac57-7d59d96a1da6. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2190.250812] env[67820]: DEBUG nova.network.neutron [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Updating instance_info_cache with network_info: [{"id": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "address": "fa:16:3e:14:58:db", "network": {"id": "6030bd31-fc45-48f6-ac49-2de7e66076d5", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-601302177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7767a564247b405b92073629bffda753", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9a7013e-08", "ovs_interfaceid": "c9a7013e-0880-49ab-ac57-7d59d96a1da6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2190.260838] env[67820]: DEBUG oslo_concurrency.lockutils [req-2cb81538-c59d-45c4-936e-b3a79fd29a1d req-55c764f2-c564-48c8-a102-4ef265c41b09 service nova] Releasing lock "refresh_cache-a3a11e77-9a46-442d-84d0-09f08acbfc64" {{(pid=67820) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2190.484954] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467512, 'name': CreateVM_Task, 'duration_secs': 0.307905} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2190.485653] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2190.486292] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2190.486292] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2190.486596] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2190.486777] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71b0a86f-ca53-412c-8ac2-0e303a856527 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2190.491279] env[67820]: DEBUG oslo_vmware.api [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Waiting for the task: (returnval){ [ 2190.491279] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]525f6d9b-10c8-0793-2d66-cc2b1e527d24" [ 2190.491279] env[67820]: _type = "Task" [ 2190.491279] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2190.498800] env[67820]: DEBUG oslo_vmware.api [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]525f6d9b-10c8-0793-2d66-cc2b1e527d24, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2191.002743] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2191.003103] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2191.003217] env[67820]: DEBUG oslo_concurrency.lockutils [None req-74c45b3a-cd1c-422f-b5cf-556c57fc67b3 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2192.871801] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "d3e993ae-b433-4f41-9692-a90a835fc053" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2192.872130] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "d3e993ae-b433-4f41-9692-a90a835fc053" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.616841] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2219.622102] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.621638] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.621930] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2223.623618] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2223.623964] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2223.623964] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2223.643556] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.643700] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.643846] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.643953] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.644407] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.644578] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.644709] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.644833] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.644954] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.645090] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2223.645211] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2224.621486] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.478988] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "e0644c9e-0d5d-4a13-8a26-e99861454d1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2225.479283] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "e0644c9e-0d5d-4a13-8a26-e99861454d1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2225.620680] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2227.621263] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2227.633632] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2227.634127] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2227.634127] env[67820]: DEBUG oslo_concurrency.lockutils [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2227.634261] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2227.635438] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6607fc2b-672b-4aed-bac8-e8823850d83e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.644266] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6bdfea4-640a-484a-9719-8ffa15ee7a16 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.658232] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d269dcc4-900c-4078-a683-adbc784e5561 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.664395] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60201257-a205-4948-a0f2-5a676e7f48a8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.694643] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180907MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2227.694788] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2227.694972] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2227.761928] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762102] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762281] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762450] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762576] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762697] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762814] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.762925] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.763051] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance faa3fbe8-d076-422d-98ba-bfde42fb0580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.763167] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a3a11e77-9a46-442d-84d0-09f08acbfc64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2227.773402] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3e993ae-b433-4f41-9692-a90a835fc053 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2227.783578] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e0644c9e-0d5d-4a13-8a26-e99861454d1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2227.783777] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2227.783917] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2227.913860] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-811bff47-e812-4f8e-bec9-788c637ac845 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.921503] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f79a20ea-bcb8-4e39-9cee-2729835645f6 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.951738] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-099efefb-8c9d-431e-ac3e-9c0063c96f6d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.958278] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27abfd29-477c-4e8f-b727-6218bf0f8f0e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2227.970638] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2227.979308] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2227.992427] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2227.992606] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2228.988265] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2228.988621] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2230.622296] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2233.432061] env[67820]: WARNING oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2233.432061] env[67820]: ERROR oslo_vmware.rw_handles [ 2233.432061] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 
tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2233.433474] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2233.433716] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Copying Virtual Disk [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/0cd4a414-a39f-4a7c-a353-0b3872f0d7c2/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2233.433992] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c3ee1157-f6ef-4869-bb89-4f5c94a765bd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.442369] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 2233.442369] env[67820]: value = "task-3467513" [ 2233.442369] env[67820]: _type = "Task" [ 2233.442369] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2233.450023] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467513, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2233.952524] env[67820]: DEBUG oslo_vmware.exceptions [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Fault InvalidArgument not matched. 
{{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2233.952801] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2233.953347] env[67820]: ERROR nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2233.953347] env[67820]: Faults: ['InvalidArgument'] [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Traceback (most recent call last): [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] yield resources [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self.driver.spawn(context, instance, image_meta, [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self._fetch_image_if_missing(context, vi) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] image_cache(vi, tmp_image_ds_loc) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] vm_util.copy_virtual_disk( [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] session._wait_for_task(vmdk_copy_task) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return self.wait_for_task(task_ref) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return evt.wait() [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] result = hub.switch() [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return self.greenlet.switch() [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self.f(*self.args, **self.kw) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] raise exceptions.translate_fault(task_info.error) [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Faults: ['InvalidArgument'] [ 2233.953347] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] [ 2233.954481] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Terminating instance [ 2233.955197] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2233.955402] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2233.955635] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dd7b7763-be16-476e-a397-52577752ebfe 
{{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.958615] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2233.958810] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2233.959549] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78209911-5140-4585-b427-843db8ac99b2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.966677] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2233.966934] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-90ab47fe-763e-4797-b683-87e6d7ca6b4d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.968999] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2233.969189] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2233.970146] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f4076ab3-f0b1-4817-a530-17486861195f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2233.974606] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 2233.974606] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]522f8a5b-5d83-4e96-238c-52916370722c" [ 2233.974606] env[67820]: _type = "Task" [ 2233.974606] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2233.981607] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]522f8a5b-5d83-4e96-238c-52916370722c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2234.030384] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2234.030581] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2234.030793] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleting the datastore file [datastore1] c29a702f-67df-47d3-84ed-0cbd3b430c48 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2234.031099] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d819ac19-f8cf-4296-9f84-1ac217e4c3be {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.036673] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 2234.036673] env[67820]: value = "task-3467515" [ 2234.036673] env[67820]: _type = "Task" [ 2234.036673] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2234.045036] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467515, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2234.484966] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2234.485359] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating directory with path [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2234.485476] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ff279e8-7101-46ee-a101-1556d9e66709 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.496646] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Created directory with path [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2234.496833] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Fetch image to [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2234.496986] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2234.497699] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab10c126-68c6-4ad1-9979-3fc797a37d89 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.504277] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-210e4aee-583e-4402-9f89-540609051c8c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.513199] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83b47d70-8d5f-4079-ad41-a53a6c8af19f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.547500] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-27edbd88-89c8-4db8-a5c7-9ddcefa28ad1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.554086] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467515, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081795} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2234.555479] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2234.555668] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2234.555836] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2234.556013] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2234.557704] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2f387ca3-eefe-4c4d-898b-bf36f2d0ee38 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.559502] env[67820]: DEBUG nova.compute.claims [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2234.559671] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2234.559875] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2234.581038] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2234.631113] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2234.690652] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2234.690858] env[67820]: DEBUG oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2234.795145] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66afd1a4-8c43-4f27-9b84-47a7ec09a37d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.803095] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eb89d31-661f-466b-8973-7a4800b6e6d3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.832464] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8679c194-c6ff-4843-9f45-b3cae683877a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.839499] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3eb30c3-1361-4935-83b7-376af1e95d0d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.853547] env[67820]: DEBUG nova.compute.provider_tree [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2234.863011] env[67820]: DEBUG nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2234.879255] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.319s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2234.879788] env[67820]: ERROR nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2234.879788] env[67820]: Faults: ['InvalidArgument'] [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Traceback (most recent call last): [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2234.879788] env[67820]: 
ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self.driver.spawn(context, instance, image_meta, [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self._fetch_image_if_missing(context, vi) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] image_cache(vi, tmp_image_ds_loc) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] vm_util.copy_virtual_disk( [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] session._wait_for_task(vmdk_copy_task) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return self.wait_for_task(task_ref) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return evt.wait() [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] result = hub.switch() [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] return self.greenlet.switch() [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] self.f(*self.args, **self.kw) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] raise exceptions.translate_fault(task_info.error) [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Faults: ['InvalidArgument'] [ 2234.879788] env[67820]: ERROR nova.compute.manager [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] [ 2234.880780] env[67820]: DEBUG nova.compute.utils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2234.881973] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Build of instance c29a702f-67df-47d3-84ed-0cbd3b430c48 was re-scheduled: A specified parameter was not correct: fileType [ 2234.881973] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2234.882349] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2234.882521] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2234.882743] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2234.883021] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2235.212524] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2235.225102] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Took 0.34 seconds to deallocate network for instance. [ 2235.318926] env[67820]: INFO nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleted allocations for instance c29a702f-67df-47d3-84ed-0cbd3b430c48 [ 2235.338949] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 622.178s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2235.340566] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 484.728s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2235.340892] env[67820]: INFO nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] During sync_power_state the instance has a pending task (spawning). Skip.
[ 2235.341211] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2235.341879] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 426.266s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2235.342128] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2235.342339] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2235.342504] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2235.344363] env[67820]: INFO nova.compute.manager [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Terminating instance [ 2235.346054] env[67820]: DEBUG nova.compute.manager [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Start destroying the instance on the hypervisor.
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2235.346246] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2235.346494] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fe56b6b2-bd53-44fc-b4a4-3ec934bf7506 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.351023] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2235.357285] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01b8d56-bb0f-4c17-8963-1c871f961735 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.386551] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c29a702f-67df-47d3-84ed-0cbd3b430c48 could not be found. [ 2235.386689] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2235.386791] env[67820]: INFO nova.compute.manager [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2235.387041] env[67820]: DEBUG oslo.service.loopingcall [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2235.387750] env[67820]: DEBUG nova.compute.manager [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2235.387852] env[67820]: DEBUG nova.network.neutron [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2235.405017] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2235.405017] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2235.405846] env[67820]: INFO nova.compute.claims [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2235.421791] env[67820]: DEBUG nova.network.neutron [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2235.445561] env[67820]: INFO nova.compute.manager [-] [instance: c29a702f-67df-47d3-84ed-0cbd3b430c48] Took 0.06 seconds to deallocate network for instance. 
[ 2235.541337] env[67820]: DEBUG oslo_concurrency.lockutils [None req-9e70902b-486c-4613-92a9-eb16d5b97996 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "c29a702f-67df-47d3-84ed-0cbd3b430c48" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2235.592024] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd1683b-ee9e-44ea-89d6-eb4a0af25745 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.599671] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5217a958-a534-43a1-a5ec-69307fcd51d3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.629486] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98e7e429-ca83-4cd3-8f98-8ac31bd5fdae {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.636463] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11c5026-24d8-41af-8674-6baebee943a0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.649430] env[67820]: DEBUG nova.compute.provider_tree [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2235.658167] env[67820]: DEBUG nova.scheduler.client.report [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2235.670614] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2235.671073] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Start building networks asynchronously for instance.
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2235.701993] env[67820]: DEBUG nova.compute.utils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2235.703333] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2235.703515] env[67820]: DEBUG nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2235.710540] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2235.770056] env[67820]: DEBUG nova.policy [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df43615850404e60b571c2ab5296519c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e17152dd1ce04f3dbcb729e8315f0006', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2235.774243] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2235.799378] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2235.799638] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2235.799877] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2235.800094] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2235.800247] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2235.800394] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2235.800597] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2235.800751] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2235.800913]
env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2235.801135] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2235.801259] env[67820]: DEBUG nova.virt.hardware [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2235.802119] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb0fed9-6ebc-4afd-9671-f59f94ce481e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.811431] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a1c74e8-7c46-4501-826c-b1c4da618927 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2236.179643] env[67820]: DEBUG nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Successfully created port: 4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2236.962526] env[67820]: DEBUG nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Successfully updated port: 4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2236.975966] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2236.975966] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2236.975966] env[67820]: DEBUG nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2237.016800] env[67820]: DEBUG 
nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2237.271838] env[67820]: DEBUG nova.network.neutron [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Updating instance_info_cache with network_info: [{"id": "4f1363de-14ac-409b-a415-c7782f5acf8c", "address": "fa:16:3e:8c:ec:55", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f1363de-14", "ovs_interfaceid": "4f1363de-14ac-409b-a415-c7782f5acf8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2237.286117] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2237.286428] env[67820]: DEBUG nova.compute.manager [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Instance network_info: |[{"id": "4f1363de-14ac-409b-a415-c7782f5acf8c", "address": "fa:16:3e:8c:ec:55", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f1363de-14", 
"ovs_interfaceid": "4f1363de-14ac-409b-a415-c7782f5acf8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2237.286907] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8c:ec:55', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '418ddd3d-5f64-407e-8e0c-c8b81639bee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4f1363de-14ac-409b-a415-c7782f5acf8c', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2237.294626] env[67820]: DEBUG oslo.service.loopingcall [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2237.295968] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2237.297048] env[67820]: DEBUG nova.compute.manager [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Received event network-vif-plugged-4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2237.297253] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Acquiring lock "d3e993ae-b433-4f41-9692-a90a835fc053-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2237.297452] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Lock "d3e993ae-b433-4f41-9692-a90a835fc053-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2237.297617] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Lock "d3e993ae-b433-4f41-9692-a90a835fc053-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2237.297777] env[67820]: DEBUG nova.compute.manager [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] No waiting events found dispatching network-vif-plugged-4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2237.297931] env[67820]: 
WARNING nova.compute.manager [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Received unexpected event network-vif-plugged-4f1363de-14ac-409b-a415-c7782f5acf8c for instance with vm_state building and task_state spawning. [ 2237.298098] env[67820]: DEBUG nova.compute.manager [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Received event network-changed-4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2237.298252] env[67820]: DEBUG nova.compute.manager [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Refreshing instance network info cache due to event network-changed-4f1363de-14ac-409b-a415-c7782f5acf8c. {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2237.298418] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Acquiring lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2237.298551] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Acquired lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2237.298701] env[67820]: DEBUG nova.network.neutron [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Refreshing network info cache for port 4f1363de-14ac-409b-a415-c7782f5acf8c {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2237.300102] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a193b486-5c95-4a5a-804d-8328d37e73ce {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2237.323743] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2237.323743] env[67820]: value = "task-3467516" [ 2237.323743] env[67820]: _type = "Task" [ 2237.323743] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2237.332895] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467516, 'name': CreateVM_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2237.602437] env[67820]: DEBUG nova.network.neutron [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Updated VIF entry in instance network info cache for port 4f1363de-14ac-409b-a415-c7782f5acf8c. 
{{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2237.602831] env[67820]: DEBUG nova.network.neutron [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Updating instance_info_cache with network_info: [{"id": "4f1363de-14ac-409b-a415-c7782f5acf8c", "address": "fa:16:3e:8c:ec:55", "network": {"id": "9fdf0b7d-999a-4e03-997d-62f0dc27cafa", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-956929255-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e17152dd1ce04f3dbcb729e8315f0006", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f1363de-14", "ovs_interfaceid": "4f1363de-14ac-409b-a415-c7782f5acf8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2237.611843] env[67820]: DEBUG oslo_concurrency.lockutils [req-5caafb09-7053-4d51-a98d-fc0fb55dcb02 req-277698f9-edbd-4906-b1b0-894af24723f5 service nova] Releasing lock "refresh_cache-d3e993ae-b433-4f41-9692-a90a835fc053" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2237.834233] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467516, 'name': CreateVM_Task, 'duration_secs': 0.275482} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2237.834395] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2237.835055] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2237.835230] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2237.835535] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2237.835776] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-61ed212a-6951-4512-b055-48d0ab7fa90c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2237.839938] env[67820]: DEBUG oslo_vmware.api [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Waiting for the task: (returnval){ [ 2237.839938] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52cfbc13-ce39-b25c-853b-ad00a9800d61" [ 2237.839938] env[67820]: _type = "Task" [ 2237.839938] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2237.851730] env[67820]: DEBUG oslo_vmware.api [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52cfbc13-ce39-b25c-853b-ad00a9800d61, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2238.352126] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2238.352521] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2238.352649] env[67820]: DEBUG oslo_concurrency.lockutils [None req-1b349197-7fe7-4567-8a3f-633224592317 tempest-AttachVolumeNegativeTest-1899229334 tempest-AttachVolumeNegativeTest-1899229334-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2250.657146] env[67820]: DEBUG oslo_concurrency.lockutils [None req-e68f09b5-457f-4e51-b11e-2bee3470d748 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "faa3fbe8-d076-422d-98ba-bfde42fb0580" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2281.623777] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2283.450108] env[67820]: WARNING oslo_vmware.rw_handles [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection 
without response [ 2283.450108] env[67820]: ERROR oslo_vmware.rw_handles [ 2283.450717] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2283.452730] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2283.452958] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Copying Virtual Disk [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/bc8a9830-a265-4d6e-80ad-c08ba8165ecc/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2283.453264] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4ac681f4-c83e-44b4-a0a8-936f2a5569c0 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2283.461231] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 2283.461231] env[67820]: value = "task-3467517" [ 2283.461231] env[67820]: _type = "Task" [ 2283.461231] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2283.468890] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467517, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2283.621674] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2283.621871] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2283.621961] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2283.644289] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.644471] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.644552] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.644676] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.644798] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.644918] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.645054] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.645176] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Skipping network cache update for instance because it is Building. 
{{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.645293] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.645410] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2283.645525] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2283.645987] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2283.646140] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2283.971264] env[67820]: DEBUG oslo_vmware.exceptions [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2283.971601] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2283.972184] env[67820]: ERROR nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2283.972184] env[67820]: Faults: ['InvalidArgument'] [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Traceback (most recent call last): [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] yield resources [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self.driver.spawn(context, instance, image_meta, [ 2283.972184] env[67820]: ERROR 
nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self._fetch_image_if_missing(context, vi) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] image_cache(vi, tmp_image_ds_loc) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] vm_util.copy_virtual_disk( [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] session._wait_for_task(vmdk_copy_task) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return self.wait_for_task(task_ref) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return evt.wait() [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] result = hub.switch() [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return self.greenlet.switch() [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self.f(*self.args, **self.kw) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 
2965c630-07c6-4e08-a5ab-4996d4c72b82] raise exceptions.translate_fault(task_info.error) [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Faults: ['InvalidArgument'] [ 2283.972184] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] [ 2283.973223] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Terminating instance [ 2283.973946] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2283.974166] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2283.974405] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-82d092f9-0e9d-4d40-b330-81ce8bd62020 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2283.976477] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Start destroying the instance on the hypervisor. 
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2283.976661] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2283.977388] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-359a405c-8d75-4213-83c5-6abd5edede69 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2283.984210] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2283.984432] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-64bcf833-51cf-4968-8e79-2f129a31d853 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2283.986521] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2283.986749] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2283.987673] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9aefb220-e80f-4da4-b822-d1c0e115b7c4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2283.992540] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for the task: (returnval){ [ 2283.992540] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]5285bc4c-f42d-6c49-7342-c0eec7498e2d" [ 2283.992540] env[67820]: _type = "Task" [ 2283.992540] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2283.999496] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]5285bc4c-f42d-6c49-7342-c0eec7498e2d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2284.051682] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2284.051908] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2284.052102] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleting the datastore file [datastore1] 2965c630-07c6-4e08-a5ab-4996d4c72b82 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2284.052423] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-324ab780-0d4a-4403-98d1-34f5ca8da20a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2284.059157] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for the task: (returnval){ [ 2284.059157] env[67820]: value = "task-3467519" [ 2284.059157] env[67820]: _type = "Task" [ 2284.059157] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2284.066623] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467519, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2284.503770] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2284.504123] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Creating directory with path [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2284.504310] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c34d4f21-75ef-4cf1-a24b-8abfecba689a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2284.515148] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Created directory with path [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2284.515332] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Fetch image to [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2284.515505] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2284.516206] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d90dccc9-25fd-471d-b0ac-e0dc932d4f16 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2284.522532] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18c06838-84bd-415e-9c90-17eed0c811c3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2284.531485] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-380e4ccb-1c2c-4377-910e-c7e90bc55685 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.279429] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1f1655cc-73df-43be-a517-78747be4b4f7 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.290566] env[67820]: DEBUG oslo_vmware.api [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Task: {'id': task-3467519, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07668} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2285.290566] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2285.290566] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2285.290566] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2285.290566] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Took 1.31 seconds to destroy the instance on the hypervisor. 
[ 2285.290820] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e59d22c8-fdb0-4854-b856-b26cbfd12c3c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.292667] env[67820]: DEBUG nova.compute.claims [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2285.292827] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2285.293507] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2285.315484] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2285.366787] env[67820]: DEBUG oslo_vmware.rw_handles [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2285.426495] env[67820]: DEBUG oslo_vmware.rw_handles [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2285.426786] env[67820]: DEBUG oslo_vmware.rw_handles [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2285.519945] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0abf76c8-288c-4c6b-b929-65698819533f {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.527576] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c78c35c9-70e8-4721-83d7-0e188f82d862 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.556682] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc082a3-4d50-4f1e-81a9-f4164b235d68 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.563633] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4760de49-b9fc-4cfc-bee2-41cb3e8d2acd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2285.576926] env[67820]: DEBUG nova.compute.provider_tree [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2285.586145] env[67820]: DEBUG nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2285.592219] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "13a5da5b-ac46-43ab-8b34-7aca76c1c059" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2285.592432] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "13a5da5b-ac46-43ab-8b34-7aca76c1c059" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2285.596952] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.304s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2285.597469] env[67820]: ERROR nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2285.597469] env[67820]: Faults: ['InvalidArgument'] [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Traceback (most recent call last): [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self.driver.spawn(context, instance, image_meta, [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self._fetch_image_if_missing(context, vi) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] image_cache(vi, tmp_image_ds_loc) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] vm_util.copy_virtual_disk( [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] session._wait_for_task(vmdk_copy_task) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return self.wait_for_task(task_ref) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return evt.wait() [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] result = hub.switch() [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] return self.greenlet.switch() [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] self.f(*self.args, **self.kw) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] raise exceptions.translate_fault(task_info.error) [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Faults: ['InvalidArgument'] [ 2285.597469] env[67820]: ERROR nova.compute.manager [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] [ 2285.598378] env[67820]: DEBUG nova.compute.utils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2285.599383] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Build of instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 was re-scheduled: A specified parameter was not correct: fileType [ 2285.599383] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2285.599809] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2285.599979] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2285.600187] env[67820]: DEBUG nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2285.600354] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2285.621585] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2285.911405] env[67820]: DEBUG nova.network.neutron [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2285.923694] env[67820]: INFO nova.compute.manager [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Took 0.32 seconds to deallocate network for instance. 
[ 2286.023497] env[67820]: INFO nova.scheduler.client.report [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Deleted allocations for instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 [ 2286.042124] env[67820]: DEBUG oslo_concurrency.lockutils [None req-17625399-9b12-47f7-b72a-930300ce390a tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 672.853s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2286.042984] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 477.031s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2286.043234] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Acquiring lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2286.043452] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2286.043623] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2286.045835] env[67820]: INFO nova.compute.manager [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Terminating instance [ 2286.047476] env[67820]: DEBUG nova.compute.manager [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Start destroying the instance on the hypervisor. 
{{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2286.047669] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2286.048136] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7150148e-404a-4d0c-ba26-d00060d4b455 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.053369] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2286.059714] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d366ed-d74e-4b68-9618-27b41d89e083 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.089259] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2965c630-07c6-4e08-a5ab-4996d4c72b82 could not be found. [ 2286.089467] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2286.089641] env[67820]: INFO nova.compute.manager [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2286.089878] env[67820]: DEBUG oslo.service.loopingcall [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2286.092104] env[67820]: DEBUG nova.compute.manager [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2286.092219] env[67820]: DEBUG nova.network.neutron [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2286.105709] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2286.105947] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2286.107378] env[67820]: INFO nova.compute.claims [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2286.118590] env[67820]: DEBUG nova.network.neutron [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2286.136603] env[67820]: INFO nova.compute.manager [-] [instance: 2965c630-07c6-4e08-a5ab-4996d4c72b82] Took 0.04 seconds to deallocate network for instance. 
[ 2286.250674] env[67820]: DEBUG oslo_concurrency.lockutils [None req-4e7683f9-36ac-41b1-9f88-24c4df60b917 tempest-MultipleCreateTestJSON-1391953396 tempest-MultipleCreateTestJSON-1391953396-project-member] Lock "2965c630-07c6-4e08-a5ab-4996d4c72b82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.208s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2286.305427] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0f7b5cc-1bfc-4615-aac6-a4ab06649f07 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.313052] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dc0f305-bdd2-4e15-8f09-7b5f83076f9a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.342771] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d51d6b09-1c7d-4c57-9776-12568e020424 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.350203] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f938598-0f12-4544-861d-af1795edd0ba {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.363090] env[67820]: DEBUG nova.compute.provider_tree [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2286.371096] env[67820]: DEBUG nova.scheduler.client.report [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2286.384253] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2286.384738] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Start building networks asynchronously for instance.
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2286.415114] env[67820]: DEBUG nova.compute.utils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2286.416490] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2286.416599] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2286.428029] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2286.491156] env[67820]: DEBUG nova.policy [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8b9868addae45a49b19e7058f737988', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '83044475bfd24b14a5a95b4b3fa0376c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2286.511509] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2286.535716] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2286.536021] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2286.536125] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2286.536301] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2286.536455] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2286.536857] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2286.537112] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2286.537278] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2286.537443] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846
tempest-ServersTestJSON-449910846-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2286.537601] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2286.537766] env[67820]: DEBUG nova.virt.hardware [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2286.538681] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f0a3b3-132c-4137-ab02-50761f81e09b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.546870] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06de1a1b-a692-4c13-a3f0-33494ce8c2dc {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2286.620630] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2286.781600] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Successfully created port: 0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2287.202359] env[67820]: DEBUG oslo_concurrency.lockutils [None req-26ba9787-9123-4346-88c8-8adef7479e75 tempest-ServerDiskConfigTestJSON-231862696 tempest-ServerDiskConfigTestJSON-231862696-project-member] Acquiring lock "a3a11e77-9a46-442d-84d0-09f08acbfc64" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2287.384541] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Successfully updated port: 0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2287.393965] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2287.394300] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2287.394565] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2287.452104] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2287.873862] env[67820]: DEBUG nova.network.neutron [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Updating instance_info_cache with network_info: [{"id": "0a11299c-028b-4263-94f3-2ed325c36007", "address": "fa:16:3e:31:76:1e", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a11299c-02", "ovs_interfaceid": "0a11299c-028b-4263-94f3-2ed325c36007", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2287.887739] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2287.888017] env[67820]: DEBUG nova.compute.manager [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Instance network_info: |[{"id": "0a11299c-028b-4263-94f3-2ed325c36007", "address": "fa:16:3e:31:76:1e", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, 
"tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a11299c-02", "ovs_interfaceid": "0a11299c-028b-4263-94f3-2ed325c36007", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2287.888397] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:76:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '365ac5b1-6d83-4dfe-887f-60574d7f6124', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0a11299c-028b-4263-94f3-2ed325c36007', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2287.896105] env[67820]: DEBUG oslo.service.loopingcall [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2287.896663] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2287.896895] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d073d4a8-5cd1-40b8-8ded-e368e747258e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2287.916810] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2287.916810] env[67820]: value = "task-3467520" [ 2287.916810] env[67820]: _type = "Task" [ 2287.916810] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2287.924405] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467520, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2287.948751] env[67820]: DEBUG nova.compute.manager [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Received event network-vif-plugged-0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2287.948751] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Acquiring lock "e0644c9e-0d5d-4a13-8a26-e99861454d1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2287.948751] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Lock "e0644c9e-0d5d-4a13-8a26-e99861454d1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2287.948751] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Lock "e0644c9e-0d5d-4a13-8a26-e99861454d1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2287.948751] env[67820]: DEBUG nova.compute.manager [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] No waiting events found dispatching network-vif-plugged-0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2287.948751] env[67820]: WARNING nova.compute.manager [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Received unexpected event network-vif-plugged-0a11299c-028b-4263-94f3-2ed325c36007 for instance with vm_state building and task_state spawning. [ 2287.948751] env[67820]: DEBUG nova.compute.manager [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Received event network-changed-0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2287.948751] env[67820]: DEBUG nova.compute.manager [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Refreshing instance network info cache due to event network-changed-0a11299c-028b-4263-94f3-2ed325c36007.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2287.949031] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Acquiring lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2287.949031] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Acquired lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2287.949307] env[67820]: DEBUG nova.network.neutron [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Refreshing network info cache for port 0a11299c-028b-4263-94f3-2ed325c36007 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2288.188979] env[67820]: DEBUG nova.network.neutron [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Updated VIF entry in instance network info cache for port 0a11299c-028b-4263-94f3-2ed325c36007. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2288.189440] env[67820]: DEBUG nova.network.neutron [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Updating instance_info_cache with network_info: [{"id": "0a11299c-028b-4263-94f3-2ed325c36007", "address": "fa:16:3e:31:76:1e", "network": {"id": "1e7c9b4f-3d81-4fe7-bc77-4c303eba011c", "bridge": "br-int", "label": "tempest-ServersTestJSON-444685636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "83044475bfd24b14a5a95b4b3fa0376c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "365ac5b1-6d83-4dfe-887f-60574d7f6124", "external-id": "nsx-vlan-transportzone-138", "segmentation_id": 138, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a11299c-02", "ovs_interfaceid": "0a11299c-028b-4263-94f3-2ed325c36007", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2288.198931] env[67820]: DEBUG oslo_concurrency.lockutils [req-00eb480b-7a44-4daa-a276-0a97f130b5b9 req-5d3397d0-a80a-4a71-96ef-f93307db188e service nova] Releasing lock "refresh_cache-e0644c9e-0d5d-4a13-8a26-e99861454d1b" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2288.427912] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467520, 'name': CreateVM_Task, 'duration_secs': 0.284607} completed successfully. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2288.428131] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2288.428818] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2288.428977] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2288.429313] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2288.429581] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f4008a7f-c604-44b5-b5d5-7ae719335424 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2288.434264] env[67820]: DEBUG oslo_vmware.api [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Waiting for the task: (returnval){ [ 2288.434264] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]529120a7-9b40-1f19-d244-54c00a0c21b2" [ 2288.434264] env[67820]: _type = "Task" [ 2288.434264] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2288.443202] env[67820]: DEBUG oslo_vmware.api [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]529120a7-9b40-1f19-d244-54c00a0c21b2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2288.615931] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2288.944623] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2288.944892] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2288.945110] env[67820]: DEBUG oslo_concurrency.lockutils [None req-274ab0d3-a70c-4728-872a-d8d4276830cc tempest-ServersTestJSON-449910846 tempest-ServersTestJSON-449910846-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2289.621460] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2289.633352] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2289.633587] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2289.633752] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2289.633903] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2289.635045] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1630788c-9186-4714-b6e8-b57bd63dfa22 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.643673] 
env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a95ceef0-659b-45b6-8500-873f9eadc757 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.657511] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f60320-3f3f-4c10-bc46-384804b1d41d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.664703] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ea816d-160f-48e3-a0ae-dbe130241c6b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.694664] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2289.694829] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2289.695038] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2289.768760] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.768953] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769096] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769245] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769341] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769458] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769575] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance faa3fbe8-d076-422d-98ba-bfde42fb0580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769691] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a3a11e77-9a46-442d-84d0-09f08acbfc64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769803] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3e993ae-b433-4f41-9692-a90a835fc053 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.769915] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e0644c9e-0d5d-4a13-8a26-e99861454d1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2289.780868] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 13a5da5b-ac46-43ab-8b34-7aca76c1c059 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2289.781086] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2289.781236] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2289.909523] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77941ee9-3332-4442-9e0e-4807082c3672 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.918498] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae135f2c-5377-4e98-88b3-e78d47852365 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.946879] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78310472-f378-49d2-8fbb-8818d57f2683 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.953542] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435e11d8-4a56-41c0-a282-046fd2a6baea {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2289.966148] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2289.974866] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2289.988870] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2289.989060] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2290.990066] env[67820]: DEBUG oslo_service.periodic_task [None 
req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2290.990066] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2334.101819] env[67820]: WARNING oslo_vmware.rw_handles [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles response.begin() [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2334.101819] env[67820]: ERROR oslo_vmware.rw_handles [ 2334.102464] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Downloaded image file data 4407539e-b292-42b4-91c4-4faa60d48bab to vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2334.104261] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Caching image {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2334.104493] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Copying Virtual Disk [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk to [datastore1] vmware_temp/7a8479d9-3e13-46d8-9f13-bfe7435cf8ba/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk {{(pid=67820) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2334.104810] env[67820]: 
DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-506bc834-86e9-46bb-889e-a332324ac114 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.114414] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for the task: (returnval){ [ 2334.114414] env[67820]: value = "task-3467521" [ 2334.114414] env[67820]: _type = "Task" [ 2334.114414] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2334.121838] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': task-3467521, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2334.625413] env[67820]: DEBUG oslo_vmware.exceptions [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Fault InvalidArgument not matched. {{(pid=67820) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2334.625697] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2334.626285] env[67820]: ERROR nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2334.626285] env[67820]: Faults: ['InvalidArgument'] [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Traceback (most recent call last): [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] yield resources [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self.driver.spawn(context, instance, image_meta, [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self._fetch_image_if_missing(context, vi) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] image_cache(vi, tmp_image_ds_loc) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] vm_util.copy_virtual_disk( [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] session._wait_for_task(vmdk_copy_task) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return self.wait_for_task(task_ref) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return evt.wait() [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] result = hub.switch() [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return self.greenlet.switch() [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self.f(*self.args, **self.kw) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] raise exceptions.translate_fault(task_info.error) [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2334.626285] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Faults: ['InvalidArgument'] [ 2334.626285] 
env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] [ 2334.627162] env[67820]: INFO nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Terminating instance [ 2334.628136] env[67820]: DEBUG oslo_concurrency.lockutils [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2334.628346] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2334.628590] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5db1c4a8-98fb-4acd-b7bb-5ecd3d6aa791 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.630728] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2334.630913] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2334.631703] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-397e04a0-9017-477e-ad46-9917ab90b3ab {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.638597] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Unregistering the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2334.638826] env[67820]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-98e0f7e8-39c0-4603-a5e0-d59ab05aec79 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.640820] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2334.640994] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 
tempest-ServerAddressesTestJSON-769203959-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67820) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2334.641900] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9374051c-6d6e-4f0d-80f0-2ace52cf4685 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.647216] env[67820]: DEBUG oslo_vmware.api [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Waiting for the task: (returnval){ [ 2334.647216] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]527b7497-adad-56dc-b960-1e9424f5157a" [ 2334.647216] env[67820]: _type = "Task" [ 2334.647216] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2334.659327] env[67820]: DEBUG oslo_vmware.api [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]527b7497-adad-56dc-b960-1e9424f5157a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2334.711066] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Unregistered the VM {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2334.711325] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Deleting contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2334.711511] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Deleting the datastore file [datastore1] bcb239dd-e793-43be-9f94-e53eb50e2f49 {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2334.711765] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b3341cda-0b4e-43c4-bc06-1ed718d106aa {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2334.717733] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for the task: (returnval){ [ 2334.717733] env[67820]: value = "task-3467523" [ 2334.717733] env[67820]: _type = "Task" [ 2334.717733] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2334.725249] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': task-3467523, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2335.157031] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Preparing fetch location {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2335.157378] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Creating directory with path [datastore1] vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2335.157565] env[67820]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0fbe715-ba04-468f-a486-958781b3fff1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.168595] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Created directory with path [datastore1] vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2335.168770] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Fetch image to [datastore1] vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk {{(pid=67820) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2335.168943] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to [datastore1] vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk on the data store datastore1 {{(pid=67820) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2335.169695] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c34232d-e00a-4537-8291-6ef5dba6e2a8 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.175969] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d4bba7e-8d1d-4fc6-a25c-d6daf3371584 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.185158] env[67820]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9069f763-d652-4834-8ce5-e61e743ff705 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.214671] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9b8fdf0-cc25-44a5-a447-b988a8bc7b54 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.223363] env[67820]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-673a4ec1-2b09-4f3e-a1b9-e6c273738c0d {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.227403] env[67820]: DEBUG oslo_vmware.api [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Task: {'id': task-3467523, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065312} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2335.227896] env[67820]: DEBUG nova.virt.vmwareapi.ds_util [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Deleted the datastore file {{(pid=67820) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2335.228088] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Deleted contents of the VM from datastore datastore1 {{(pid=67820) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2335.228261] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2335.228427] env[67820]: INFO nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Took 0.60 seconds to destroy the instance on the hypervisor. 
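The teardown above follows the standard oslo.vmware task pattern: each mutating vCenter call (VirtualMachine.UnregisterVM, FileManager.DeleteDatastoreFile_Task) returns a Task managed-object reference, and the session polls it until it reports success or raises a translated fault. A minimal sketch of that pattern outside Nova, with placeholder vCenter credentials and an illustrative datastore path (not values from this deployment):

    # Minimal sketch of the oslo.vmware invoke/poll pattern seen above;
    # host, credentials, and the file path are placeholders.
    from oslo_vmware import api, vim_util

    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   10,    # api_retry_count
                                   0.5)   # task_poll_interval, seconds

    # FileManager.DeleteDatastoreFile_Task, as invoked in the log above.
    content = session.vim.service_content
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', content.fileManager,
        name='[datastore1] bcb239dd-e793-43be-9f94-e53eb50e2f49',
        datacenter=vim_util.get_moref('ha-datacenter', 'Datacenter'))

    # Drives the "Task: {...} progress is 0%" / "completed successfully"
    # lines; raises a translated exception on a vCenter fault.
    session.wait_for_task(task)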
[ 2335.230393] env[67820]: DEBUG nova.compute.claims [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Aborting claim: {{(pid=67820) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2335.230560] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2335.230766] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2335.247012] env[67820]: DEBUG nova.virt.vmwareapi.images [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Downloading image file data 4407539e-b292-42b4-91c4-4faa60d48bab to the data store datastore1 {{(pid=67820) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2335.295067] env[67820]: DEBUG oslo_vmware.rw_handles [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67820) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2335.353918] env[67820]: DEBUG oslo_vmware.rw_handles [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Completed reading data from the image iterator. {{(pid=67820) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2335.354254] env[67820]: DEBUG oslo_vmware.rw_handles [None req-97f70ebd-727a-4086-9ecc-6e64a376d6eb tempest-ServerAddressesTestJSON-769203959 tempest-ServerAddressesTestJSON-769203959-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0e8a127f-8dac-4a75-ae69-296d57dcf70f/4407539e-b292-42b4-91c4-4faa60d48bab/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67820) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2335.458955] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e752f196-749c-4ab1-9670-feb83b6e4324 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.466663] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6f5f5c2-c31b-42b8-b87b-a7df9786033b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.495944] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e86adf0e-2b7a-44e3-ab8f-1d080add08e3 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.502869] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fae4cad9-4158-4fbd-9b30-19750de2c709 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.516406] env[67820]: DEBUG nova.compute.provider_tree [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2335.524845] env[67820]: DEBUG nova.scheduler.client.report [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2335.539881] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2335.539881] env[67820]: ERROR nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2335.539881] env[67820]: Faults: ['InvalidArgument'] [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Traceback (most recent call last): [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2335.539881] 
env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self.driver.spawn(context, instance, image_meta, [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self._fetch_image_if_missing(context, vi) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] image_cache(vi, tmp_image_ds_loc) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] vm_util.copy_virtual_disk( [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] session._wait_for_task(vmdk_copy_task) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return self.wait_for_task(task_ref) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return evt.wait() [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] result = hub.switch() [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] return self.greenlet.switch() [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] self.f(*self.args, **self.kw) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] raise exceptions.translate_fault(task_info.error) [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Faults: ['InvalidArgument'] [ 2335.539881] env[67820]: ERROR nova.compute.manager [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] [ 2335.539881] env[67820]: DEBUG nova.compute.utils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] VimFaultException {{(pid=67820) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2335.541034] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Build of instance bcb239dd-e793-43be-9f94-e53eb50e2f49 was re-scheduled: A specified parameter was not correct: fileType [ 2335.541034] env[67820]: Faults: ['InvalidArgument'] {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2335.541343] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Unplugging VIFs for instance {{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2335.541516] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67820) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2335.541684] env[67820]: DEBUG nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2335.541844] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2335.820060] env[67820]: DEBUG nova.network.neutron [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2335.831430] env[67820]: INFO nova.compute.manager [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Took 0.29 seconds to deallocate network for instance. [ 2335.927549] env[67820]: INFO nova.scheduler.client.report [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Deleted allocations for instance bcb239dd-e793-43be-9f94-e53eb50e2f49 [ 2335.952380] env[67820]: DEBUG oslo_concurrency.lockutils [None req-07177926-ba2b-4b44-9a66-b2b55cb0eb67 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 675.698s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2335.953576] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 479.781s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2335.953793] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Acquiring lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2335.954008] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2335.954201] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2335.956194] env[67820]: INFO nova.compute.manager [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Terminating instance [ 2335.957965] env[67820]: DEBUG nova.compute.manager [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Start destroying the instance on the hypervisor. {{(pid=67820) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2335.958172] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Destroying instance {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2335.958669] env[67820]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-24e58ce0-5897-4195-9f82-334e7fdd75e2 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2335.963831] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Starting instance... {{(pid=67820) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2335.970053] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7324403f-3429-4fbc-a3fa-83b38ccfe7d4 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.000250] env[67820]: WARNING nova.virt.vmwareapi.vmops [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bcb239dd-e793-43be-9f94-e53eb50e2f49 could not be found. 
[ 2336.000450] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Instance destroyed {{(pid=67820) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2336.000622] env[67820]: INFO nova.compute.manager [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2336.000888] env[67820]: DEBUG oslo.service.loopingcall [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2336.003113] env[67820]: DEBUG nova.compute.manager [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Deallocating network for instance {{(pid=67820) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2336.003245] env[67820]: DEBUG nova.network.neutron [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] deallocate_for_instance() {{(pid=67820) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2336.016654] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2336.016895] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2336.018389] env[67820]: INFO nova.compute.claims [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2336.029497] env[67820]: DEBUG nova.network.neutron [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Updating instance_info_cache with network_info: [] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2336.048025] env[67820]: INFO nova.compute.manager [-] [instance: bcb239dd-e793-43be-9f94-e53eb50e2f49] Took 0.04 seconds to deallocate network for instance.
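The inventory reported for provider 0f792661-ec04-4fc2-898f-e9860339eddd follows the Placement model, where schedulable capacity per resource class is (total - reserved) * allocation_ratio. Checking that against the values logged above:

    # Capacity implied by the inventory dict logged for this provider.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 -- so the single
    # m1.nano claim (1 VCPU, 128 MB) fits comfortably.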
[ 2336.139660] env[67820]: DEBUG oslo_concurrency.lockutils [None req-0d61cc0c-cc84-459d-a8d4-df1a548d6fc3 tempest-AttachInterfacesTestJSON-1880309384 tempest-AttachInterfacesTestJSON-1880309384-project-member] Lock "bcb239dd-e793-43be-9f94-e53eb50e2f49" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.186s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2336.190300] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35c287d7-88a3-40af-a70a-e96b9c90687c {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.197641] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74c18774-59c7-4142-a73c-600b6697cecd {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.229030] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d24001b-4f58-4c75-bc1a-9fb53121bd59 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.235823] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-633494e1-f01d-4c53-8e00-d9cafaa638ec {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.248671] env[67820]: DEBUG nova.compute.provider_tree [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2336.257345] env[67820]: DEBUG nova.scheduler.client.report [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2336.271819] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2336.272315] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Start building networks asynchronously for instance.
{{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2336.304551] env[67820]: DEBUG nova.compute.utils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Using /dev/sd instead of None {{(pid=67820) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2336.306054] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Allocating IP information in the background. {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2336.306245] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] allocate_for_instance() {{(pid=67820) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2336.314886] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Start building block device mappings for instance. {{(pid=67820) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2336.362511] env[67820]: DEBUG nova.policy [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd81a07fc3b4c470c8f5a087d8825c5df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '890ffca423414cd69eca6a6bf4d1ac66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67820) authorize /opt/stack/nova/nova/policy.py:203}} [ 2336.378912] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Start spawning the instance on the hypervisor. 
{{(pid=67820) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2336.403921] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T20:42:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T20:41:48Z,direct_url=<?>,disk_format='vmdk',id=4407539e-b292-42b4-91c4-4faa60d48bab,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0c9919c381ed4ae08ec1c6d27ce1eaac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T20:41:49Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2336.404195] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2336.404388] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image limits 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2336.404585] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Flavor pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2336.404732] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Image pref 0:0:0 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2336.404876] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67820) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2336.405092] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2336.405256] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2336.405423] env[67820]: DEBUG nova.virt.hardware [None
req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Got 1 possible topologies {{(pid=67820) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2336.405584] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2336.405788] env[67820]: DEBUG nova.virt.hardware [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67820) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2336.406672] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5f54d53-38ff-4ab1-9ccf-64cfd6285a3a {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.414795] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1126f42-80ef-4ae8-89ed-315567b52383 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.702736] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Successfully created port: 05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2337.256411] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Successfully updated port: 05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2337.269373] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2337.269528] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2337.269674] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Building network info cache for instance {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2337.305550] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 
tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Instance cache missing network info. {{(pid=67820) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2337.498770] env[67820]: DEBUG nova.network.neutron [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Updating instance_info_cache with network_info: [{"id": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "address": "fa:16:3e:a1:16:2f", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e151c0-3f", "ovs_interfaceid": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2337.512364] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2337.512650] env[67820]: DEBUG nova.compute.manager [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Instance network_info: |[{"id": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "address": "fa:16:3e:a1:16:2f", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e151c0-3f", "ovs_interfaceid": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67820) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2337.513083] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:16:2f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '05e151c0-3f1c-4658-b8ae-bd20943e1457', 'vif_model': 'vmxnet3'}] {{(pid=67820) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2337.520893] env[67820]: DEBUG oslo.service.loopingcall [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67820) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2337.521361] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Creating VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2337.521600] env[67820]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1c2567c1-249d-4a80-b79a-4347c6b30a62 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2337.543299] env[67820]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2337.543299] env[67820]: value = "task-3467524" [ 2337.543299] env[67820]: _type = "Task" [ 2337.543299] env[67820]: } to complete. {{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2337.556606] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467524, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2337.883684] env[67820]: DEBUG nova.compute.manager [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Received event network-vif-plugged-05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2337.883940] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Acquiring lock "13a5da5b-ac46-43ab-8b34-7aca76c1c059-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2337.884121] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Lock "13a5da5b-ac46-43ab-8b34-7aca76c1c059-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2337.884294] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Lock "13a5da5b-ac46-43ab-8b34-7aca76c1c059-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2337.884455] env[67820]: DEBUG nova.compute.manager [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] No waiting events found dispatching network-vif-plugged-05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2337.884613] env[67820]: WARNING nova.compute.manager [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Received unexpected event network-vif-plugged-05e151c0-3f1c-4658-b8ae-bd20943e1457 for instance with vm_state building and task_state spawning. [ 2337.884764] env[67820]: DEBUG nova.compute.manager [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Received event network-changed-05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2337.884912] env[67820]: DEBUG nova.compute.manager [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Refreshing instance network info cache due to event network-changed-05e151c0-3f1c-4658-b8ae-bd20943e1457.
{{(pid=67820) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2337.885100] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Acquiring lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2337.885239] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Acquired lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2337.885391] env[67820]: DEBUG nova.network.neutron [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Refreshing network info cache for port 05e151c0-3f1c-4658-b8ae-bd20943e1457 {{(pid=67820) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2338.054336] env[67820]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467524, 'name': CreateVM_Task, 'duration_secs': 0.294145} completed successfully. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2338.054579] env[67820]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Created VM on the ESX host {{(pid=67820) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2338.055447] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2338.055721] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2338.056127] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2338.056397] env[67820]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cd2b01e-85bf-47b9-bbe2-0b200cfb9a6e {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2338.061512] env[67820]: DEBUG oslo_vmware.api [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Waiting for the task: (returnval){ [ 2338.061512] env[67820]: value = "session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d97394-724e-9c86-b326-bbfae3dde78c" [ 2338.061512] env[67820]: _type = "Task" [ 2338.061512] env[67820]: } to complete. 
{{(pid=67820) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2338.068970] env[67820]: DEBUG oslo_vmware.api [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Task: {'id': session[52fa9794-d32e-c496-0a13-09ee307dfa03]52d97394-724e-9c86-b326-bbfae3dde78c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67820) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2338.124855] env[67820]: DEBUG nova.network.neutron [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Updated VIF entry in instance network info cache for port 05e151c0-3f1c-4658-b8ae-bd20943e1457. {{(pid=67820) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2338.125211] env[67820]: DEBUG nova.network.neutron [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Updating instance_info_cache with network_info: [{"id": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "address": "fa:16:3e:a1:16:2f", "network": {"id": "7f8b6959-9b69-4d7a-827f-b3362e5f9199", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-995601471-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "890ffca423414cd69eca6a6bf4d1ac66", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap05e151c0-3f", "ovs_interfaceid": "05e151c0-3f1c-4658-b8ae-bd20943e1457", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67820) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2338.134457] env[67820]: DEBUG oslo_concurrency.lockutils [req-5cb83c98-271e-4059-910f-f4cfc916e80d req-3e4ea29f-4cb0-46c4-a2b8-7e9221f9f2b5 service nova] Releasing lock "refresh_cache-13a5da5b-ac46-43ab-8b34-7aca76c1c059" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2338.572411] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2338.572705] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Processing image 4407539e-b292-42b4-91c4-4faa60d48bab {{(pid=67820) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2338.572857] env[67820]: DEBUG oslo_concurrency.lockutils [None req-806a2adb-9963-4ce2-9b62-e8047003506b tempest-DeleteServersTestJSON-1163079554 tempest-DeleteServersTestJSON-1163079554-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4407539e-b292-42b4-91c4-4faa60d48bab/4407539e-b292-42b4-91c4-4faa60d48bab.vmdk" {{(pid=67820) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2339.584364] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.584795] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Getting list of instances from cluster (obj){ [ 2339.584795] env[67820]: value = "domain-c8" [ 2339.584795] env[67820]: _type = "ClusterComputeResource" [ 2339.584795] env[67820]: } {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2339.585845] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c24d09-a59b-40c7-8950-c7328328c171 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2339.602332] env[67820]: DEBUG nova.virt.vmwareapi.vmops [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Got total of 10 instances {{(pid=67820) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2339.621466] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.643259] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.643442] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2339.651228] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] There are 0 instances to clean {{(pid=67820) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2343.629775] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.630152] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Starting heal instance info cache {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2343.630152] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Rebuilding the list of instances to heal {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2343.649766] env[67820]: DEBUG 
[ 2343.649766] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: fffda39c-1960-49f9-a26b-6b87e2c3c53e] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.649906] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eb759eb8-e670-4b9b-a0e0-865bdd53a208] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650042] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650178] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 1068e5cc-2514-4e07-aeee-e7e64c95a979] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650302] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: eda6fcc3-b964-4728-a2e2-ece044b0ffa2] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650422] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: faa3fbe8-d076-422d-98ba-bfde42fb0580] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650543] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: a3a11e77-9a46-442d-84d0-09f08acbfc64] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650661] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: d3e993ae-b433-4f41-9692-a90a835fc053] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650807] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: e0644c9e-0d5d-4a13-8a26-e99861454d1b] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.650941] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] [instance: 13a5da5b-ac46-43ab-8b34-7aca76c1c059] Skipping network cache update for instance because it is Building. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2343.651075] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Didn't find any instances for network info cache update. {{(pid=67820) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2343.651555] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2345.621593] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2345.621953] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67820) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2347.622135] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2348.616562] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2348.621258] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2349.622396] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2350.629032] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2350.629418] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2350.629418] env[67820]: DEBUG nova.compute.manager [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Cleaning up deleted instances with incomplete migration {{(pid=67820) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 2351.632073] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
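The recurring "Running periodic task ComputeManager._*" lines are produced by oslo.service's periodic task machinery, which scans a manager class for decorated methods and dispatches each one on its own interval. A minimal sketch of that pattern, under the assumption of a simplified manager (real Nova wires this through its service framework):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _heal_instance_info_cache(self, context):
            pass  # refresh one instance's network info cache per pass

    # A service timer calls manager.run_periodic_tasks(context) repeatedly;
    # each due task is logged as "Running periodic task ..." before it runs.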
[ 2351.632073] env[67820]: DEBUG oslo_service.periodic_task [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67820) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2351.642770] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2351.642980] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2351.643163] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2351.643355] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67820) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2351.644507] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb2764a4-9e04-44e9-9519-0350a0db53df {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2351.653406] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcbc10e5-2bf9-47b2-b503-3f9076db27ed {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2351.667177] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3bbbf9c-e36d-4591-a0f2-4dc7be552deb {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2351.673045] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9beb33eb-2ab4-4ff2-987d-fb08957f8242 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2351.701093] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180926MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=67820) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2351.701231] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2351.701416] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2351.802308] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance fffda39c-1960-49f9-a26b-6b87e2c3c53e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.802487] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eb759eb8-e670-4b9b-a0e0-865bdd53a208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.802621] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 4fb7ac00-ff06-4cf0-8a5e-41c76a390b38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.802744] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 1068e5cc-2514-4e07-aeee-e7e64c95a979 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.802865] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance eda6fcc3-b964-4728-a2e2-ece044b0ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.802983] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance faa3fbe8-d076-422d-98ba-bfde42fb0580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.803122] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance a3a11e77-9a46-442d-84d0-09f08acbfc64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.803293] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance d3e993ae-b433-4f41-9692-a90a835fc053 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.803401] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance e0644c9e-0d5d-4a13-8a26-e99861454d1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.803520] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Instance 13a5da5b-ac46-43ab-8b34-7aca76c1c059 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67820) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2351.803719] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2351.803859] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67820) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2351.819210] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing inventories for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 2351.833833] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating ProviderTree inventory for provider 0f792661-ec04-4fc2-898f-e9860339eddd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 2351.834237] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Updating inventory in ProviderTree for provider 0f792661-ec04-4fc2-898f-e9860339eddd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 2351.844755] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing aggregate associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, aggregates: None {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
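The arithmetic behind the "Final resource view" line above checks out against values already present in this log: ten instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} in placement, plus the 512 MB 'reserved' in the MEMORY_MB inventory. A worked check (values taken from this log only):

    # Worked check of "Final resource view" using only numbers from this log.
    reserved_ram_mb = 512                              # MEMORY_MB inventory 'reserved'
    instances = 10                                     # allocations listed above
    used_ram_mb = reserved_ram_mb + instances * 128    # = 1792 -> used_ram=1792MB
    used_vcpus = instances * 1                         # = 10   -> used_vcpus=10
    used_disk_gb = instances * 1                       # = 10   -> used_disk=10GB
    assert (used_ram_mb, used_vcpus, used_disk_gb) == (1792, 10, 10)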
[ 2351.861153] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Refreshing trait associations for resource provider 0f792661-ec04-4fc2-898f-e9860339eddd, traits: COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=67820) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 2351.971427] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb3bac1-15e3-41ad-a9e9-8b9cbd018176 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2351.978881] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab7dc285-8496-4513-b5bc-848c1a84d806 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2352.008839] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a07e7710-f351-488a-a8fe-c1320e8e651b {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2352.015927] env[67820]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aacb442c-9ed2-49b9-a9aa-fef01dfbc5f1 {{(pid=67820) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2352.028787] env[67820]: DEBUG nova.compute.provider_tree [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed in ProviderTree for provider: 0f792661-ec04-4fc2-898f-e9860339eddd {{(pid=67820) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2352.038358] env[67820]: DEBUG nova.scheduler.client.report [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Inventory has not changed for provider 0f792661-ec04-4fc2-898f-e9860339eddd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67820) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2352.053397] env[67820]: DEBUG nova.compute.resource_tracker [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67820) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2352.053581] env[67820]: DEBUG oslo_concurrency.lockutils [None req-c05cf225-7786-4d7c-9053-88f50c0b0489 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s {{(pid=67820) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
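For the inventory payload repeated in the lines above, placement derives consumable capacity per resource class as (total - reserved) * allocation_ratio. A sketch of that formula applied to the logged values (the loop below is illustrative, not placement's actual code):

    # Effective capacity implied by the logged inventory for provider
    # 0f792661-ec04-4fc2-898f-e9860339eddd.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(rc, capacity)  # VCPU 192, MEMORY_MB 196078, DISK_GB 400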